914 results for Spinal injury, Classification system, Severity measure, Treatment algorithm, Methodological review


Relevance:

100.00%

Publisher:

Abstract:

Purpose: To analyze and define the errors that may be introduced into keratoconus classification when keratometric corneal power is used for grading. Materials and methods: Retrospective study including a total of 44 keratoconus eyes. A comprehensive ophthalmologic examination was performed in all cases, including corneal analysis with the Pentacam system (Oculus). Classical keratometric corneal power (Pk), Gaussian corneal power (Pc Gauss), True Net Power (TNP) (Gaussian power neglecting the corneal thickness effect), and an adjusted keratometric corneal power (Pkadj) (keratometric power computed with a variable keratometric index) were calculated. All cases included in the study were classified according to five different classification systems: Alió-Shabayek, Amsler-Krumeich, Rabinowitz-McDonnell, Collaborative Longitudinal Evaluation of Keratoconus (CLEK), and McMahon. Results: When Pk and Pkadj were compared, differences in keratoconus grading were found in 13.6% of eyes with the Alió-Shabayek and Amsler-Krumeich systems. Likewise, grading differences were observed in 22.7% of eyes with the Rabinowitz-McDonnell and McMahon classification systems and in 31.8% of eyes with the CLEK classification system. All cases reclassified with Pkadj moved to a less severe stage, indicating that the use of Pk may lead to a normal cornea being classified as keratoconic. In general, the results obtained using Pkadj, Pc Gauss or TNP were equivalent; differences between Pkadj and Pc Gauss were within ±0.7 D. Conclusion: The use of classical keratometric corneal power may lead to incorrect grading of keratoconus severity, with a trend toward more severe grading.
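The gap between the keratometric and Gaussian estimates can be illustrated with standard thick-lens optics. The sketch below is not taken from the paper; it assumes the conventional keratometric index of 1.3375 and the Gullstrand refractive indices for cornea (1.376) and aqueous (1.336), and it omits the variable-index adjustment used for Pkadj.

```python
# Sketch: classical keratometric power vs. Gaussian (thick-lens) corneal power.
# Assumptions (not from the paper): n_keratometric = 1.3375, n_cornea = 1.376,
# n_aqueous = 1.336; radii in millimetres, thickness in micrometres.

N_AIR, N_KER, N_CORNEA, N_AQUEOUS = 1.000, 1.3375, 1.376, 1.336

def keratometric_power(r_anterior_mm: float) -> float:
    """Classical Pk: single-surface power with the 1.3375 keratometric index."""
    return (N_KER - N_AIR) / (r_anterior_mm / 1000.0)

def true_net_power(r_anterior_mm: float, r_posterior_mm: float) -> float:
    """TNP: sum of anterior and posterior surface powers, thickness neglected."""
    p_ant = (N_CORNEA - N_AIR) / (r_anterior_mm / 1000.0)
    p_post = (N_AQUEOUS - N_CORNEA) / (r_posterior_mm / 1000.0)
    return p_ant + p_post

def gaussian_power(r_anterior_mm: float, r_posterior_mm: float, pachymetry_um: float) -> float:
    """Pc Gauss: thick-lens formula including central corneal thickness."""
    p_ant = (N_CORNEA - N_AIR) / (r_anterior_mm / 1000.0)
    p_post = (N_AQUEOUS - N_CORNEA) / (r_posterior_mm / 1000.0)
    d = pachymetry_um / 1e6  # micrometres -> metres
    return p_ant + p_post - (d / N_CORNEA) * p_ant * p_post

# Hypothetical keratoconic cornea: anterior radius 6.8 mm, posterior radius
# 5.5 mm, central thickness 480 um.
if __name__ == "__main__":
    print(round(keratometric_power(6.8), 2))          # ~49.6 D
    print(round(true_net_power(6.8, 5.5), 2))         # ~48.0 D
    print(round(gaussian_power(6.8, 5.5, 480.0), 2))  # ~48.2 D
```

In this made-up example the classical keratometric value exceeds the Gaussian estimate by roughly 1.5 D, the kind of offset that can shift an eye into a more severe grade in keratometry-based classifications.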

Relevance:

100.00%

Publisher:

Abstract:

%WL: percentage of weight loss; %FL: percentage of fat loss. Objective: to describe the methodology used to manage qualitative-quantitative dietary treatment of overweight and obesity. Method: 4,625 consultations were carried out with 616 overweight or obese patients older than 25 years in south-eastern Spain during 2006-12. A balanced, hypocaloric, qualitative-quantitative diet based on local foods was used. The methodology of dietary treatment during the weight-loss and maintenance phases is described, including which units of measurement are appropriate for expressing success in weight loss, together with a new approach to individualized counselling and multidisciplinary treatment. Results: 80% of patients achieved a %FL ≥ 5% (22.6 ± 11.8 to 11.2 ± 7.4) and attended the clinic for more than a month and a half. Conclusion: a scheme of the dietary-treatment methodology is presented; the units of measurement to be used in the clinic and in the publication of clinical trials are recommended, setting a precedent, with a grade of evidence, for how to determine successful loss; measurement of hip and waist circumferences and the study of body image are recommended; and a new approach to individualized counselling and advanced multidisciplinary care is presented, independent of age, pregnancy status and physical disability. The health professional should be regarded as the manager responsible for determining which techniques may be most effective in achieving this loss.
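As a point of reference for the units discussed above, the following sketch computes %WL and %FL from baseline and follow-up measurements. The definitions are assumed (weight lost as a percentage of initial weight, and fat mass lost as a percentage of initial fat mass); the abstract itself does not spell them out.

```python
# Sketch of the two outcome measures named in the abstract, under assumed
# definitions (the abstract does not define them explicitly).

def percent_weight_loss(initial_kg: float, current_kg: float) -> float:
    """%WL: weight lost as a percentage of the initial body weight."""
    return 100.0 * (initial_kg - current_kg) / initial_kg

def percent_fat_loss(initial_fat_kg: float, current_fat_kg: float) -> float:
    """%FL: fat mass lost as a percentage of the initial fat mass."""
    return 100.0 * (initial_fat_kg - current_fat_kg) / initial_fat_kg

# Hypothetical patient: 95 kg with 38 kg fat mass at baseline,
# 88 kg with 33 kg fat mass at follow-up.
print(round(percent_weight_loss(95, 88), 1))  # 7.4 %WL
print(round(percent_fat_loss(38, 33), 1))     # 13.2 %FL
```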

Relevance:

100.00%

Publisher:

Abstract:

The construction industry is characterised by fragmentation and suffers from a lack of collaboration, often adopting adversarial working practices to achieve deliverables. For the UK Government and the construction industry, BIM is a game changer aimed at rectifying this fragmentation and promoting collaboration. However, it has become clear that there is an essential need for better controls and definitions of both data deliverables and data classification. Traditional methods and techniques for collating and inputting data have been shown to be time-consuming and do little to improve or add value to the deliverables. Hence the industry's need to develop a Digital Plan of Work (DPoW) toolkit that would aid the decision-making process, provide the required control over project workflows and data deliverables, and enable better collaboration through transparency of need and delivery. The specification for the existing DPoW called for an industry-standard method of describing geometric, requirements and data deliveries at key stages of the project cycle, with the addition of a structured and standardised information classification system. However, surveys and interviews conducted within this research indicate that the current DPoW resembles a digitised version of the pre-existing plans of work and does not push towards the data-enriched decision-making abilities that advances in technology now offer. A digital framework is not simply the digitisation of current or historic standard methods and procedures; it is a new, intelligence-driven digital system that uses new tools, processes, procedures and workflows to eradicate waste and increase efficiency. In addition to reporting on the surveys above, this paper presents a theoretical investigation into the use of Intelligent Decision Support Systems within a digital plan of work framework. Furthermore, it presents findings on the suitability of advances in intelligent decision-making frameworks and Artificial Intelligence for a UK BIM Framework, which should form the foundations of decision-making for projects implemented at BIM Level 2. The gap identified in this paper is that the current digital toolkit does not incorporate the intelligent capabilities available in other industries through advances in technology and the collation of vast amounts of data, capabilities that a digital plan of work framework could have access to and use to develop, learn and adapt its decision-making through the live interaction of project stakeholders.

Relevance:

100.00%

Publisher:

Abstract:

Final dissertation of the Integrated Master's Degree in Medicine, Faculdade de Medicina, Universidade de Lisboa, 2014

Relevance:

100.00%

Publisher:

Abstract:

Final dissertation of the Integrated Master's Degree in Medicine, Faculdade de Medicina, Universidade de Lisboa, 2014

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Treatment of patients with severe liver dysfunction, including hyperbilirubinemia secondary to liver metastases of gastrointestinal (GI) cancer, is challenging. A regimen of oxaliplatin and fluoropyrimidine (FP)/folinic acid (FA) ± a monoclonal antibody (moAb) represents a feasible option given the pharmacokinetics of these agents. Clinical data on the respective dosage and tolerability are limited and no recommendations are available. METHODS: Consecutive patients with severe hyperbilirubinemia [>2 × the upper limit of normal (ULN) and >2.4 mg/dl] due to liver metastases of GI cancer, without options for drainage, who received oxaliplatin and FP/FA ± moAb were analyzed. To collect further data, a review of the literature was performed. RESULTS: A total of 12 patients were identified between 2011 and 2015. At treatment start, the median bilirubin level was 6.1 mg/dl (>5 × ULN, range 2.7-13.6). The majority of patients (n = 11) received a dose-reduced regimen of oxaliplatin (60-76%) and FP/FA (0-77%), rapidly escalated to the full-dose regimen. During treatment, bilirubin levels dropped by more than 50% within 8 weeks or normalized within 12 weeks in 6 patients (responders). Median overall survival was 5.75 months (range 1.0-16.0 months) but was significantly prolonged in responders compared to nonresponders [9.7 vs. 3.0 months, p = 0.026 (two-sided test); 95% confidence interval (CI): 1.10-10.22]. In addition, case reports or series comprising a further 26 patients were identified. Based on the obtained data, a treatment algorithm was developed. CONCLUSION: Treatment with oxaliplatin and FP/FA ± moAb is feasible and may provide relevant benefit in patients with severe liver dysfunction caused by GI cancer liver metastases without further options for drainage.
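The responder definition reported above (bilirubin falling by more than 50% within 8 weeks, or normalizing within 12 weeks) can be expressed as a small helper. This is a sketch of that definition only, not of the treatment algorithm developed in the paper; the function name and the measurement format are assumptions.

```python
# Sketch: classify a patient as responder per the definition in the abstract.
# Input: baseline bilirubin (mg/dl), upper limit of normal (mg/dl), and a list
# of (weeks_since_start, bilirubin_mg_dl) follow-up measurements.

def is_responder(baseline, uln, measurements):
    """True if bilirubin drops >50% within 8 weeks or normalizes within 12 weeks."""
    for week, value in measurements:
        if week <= 8 and value < 0.5 * baseline:
            return True
        if week <= 12 and value <= uln:
            return True
    return False

# Hypothetical course: baseline 6.1 mg/dl, ULN 1.2 mg/dl.
course = [(2, 5.0), (6, 2.8), (10, 1.1)]
print(is_responder(6.1, 1.2, course))  # True (drop >50% by week 6)
```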

Relevance:

100.00%

Publisher:

Abstract:

OBJECTIVE This study is a prospective, controlled clinical and electrophysiologic trial examining the chronic course of posttraumatic sleep-wake disturbances (SWD). METHODS We screened 140 patients with acute, first-ever traumatic brain injury of any severity and included 60 patients for prospective follow-up examinations. Patients with prior brain trauma, other neurologic or systemic disease, drug abuse, or psychiatric comorbidities were excluded. Eighteen months after trauma, we performed detailed sleep assessment in 31 participants. As a control group, we enrolled healthy individuals without prior brain trauma matched for age, sex, and sleep satiation. RESULTS In the chronic state after traumatic brain injury, sleep need per 24 hours was persistently increased in trauma patients (8.1 ± 0.5 hours) as compared to healthy controls (7.1 ± 0.7 hours). The prevalence of chronic objective excessive daytime sleepiness was 67% in patients with brain trauma compared to 19% in controls. Patients significantly underestimated excessive daytime sleepiness and sleep need, emphasizing the unreliability of self-assessments on SWD in trauma patients. CONCLUSIONS This study provides prospective, controlled, and objective evidence for chronic persistence of posttraumatic SWD, which remain underestimated by patients. These results have clinical and medicolegal implications given that SWD can exacerbate other outcomes of traumatic brain injury, impair quality of life, and are associated with public safety hazards.

Relevance:

100.00%

Publisher:

Abstract:

Mode of access: Internet.

Relevance:

100.00%

Publisher:

Abstract:

The prevalence of dementia is growing in developed countries where elderly patients are increasing in numbers. Neurotransmission modulation is one approach to the treatment of dementia. Cholinergic precursors, anticholinesterases, nicotine receptor agonists and muscarinic M2 receptor antagonists are agents that enhance cholinergic neurotransmission and that depend on having some intact cholinergic innervation to be effective in the treatment of dementia. The cholinergic precursor choline alfoscerate may be emerging as a potentially useful drug in the treatment of dementia, with few adverse effects. Of the anticholinesterases, donepezil, in addition to having a similar efficacy to tacrine in mild-to-moderate Alzheimer's disease (AD), appears to have major advantages; its use is associated with lower drop-out rates in clinical trials, a lower incidence of cholinergic-like side effects and no liver toxicity. Rivastigmine is efficacious in the treatment of dementia with Lewy bodies, a condition in which the other anticholinesterases have not been tested extensively to date. Galantamine is an anticholinesterase and also acts as an allosteric potentiating modulator at nicotinic receptors to increase the release of acetylcholine. Pooled data from clinical trials of patients with mild-to-moderate AD suggest that the benefits and safety profile of galantamine are similar to those of the anticholinesterases. Selective nicotine receptor agonists are being developed that enhance cognitive performance without influencing autonomic and skeletal muscle function, but these have not yet entered clinical trials for dementia. Unlike the cholinergic enhancers, the M1 receptor agonists do not depend upon intact cholinergic nerves but on intact M1 receptors for their action, which are mainly preserved in AD and dementia with Lewy bodies. The M1 receptor-selective agonists developed to date have shown limited efficacy in clinical trials and have a high incidence of side effects. A major recent advancement in the treatment of dementia is memantine, a non-competitive antagonist at NMDA receptors. Memantine is beneficial in the treatment of severe and moderate-to-severe AD and may also be of some benefit in the treatment of mild-to-moderate vascular dementia. Drugs that modulate 5-HT, somatostatin and noradrenergic neurotransmission are also being considered for the treatment of dementia.

Relevance:

100.00%

Publisher:

Abstract:

Study Design. Experimental study of muscle changes after lumbar spinal injury. Objectives. To investigate the effects of intervertebral disc and nerve root lesions on the cross-sectional area, histology and chemistry of the porcine lumbar multifidus. Summary of Background Data. The multifidus cross-sectional area is reduced in acute and chronic low back pain. Although chronic changes are widespread, acute changes at one segment are identified within days of injury. It is uncertain whether changes precede or follow injury, or what the mechanism is. Methods. The multifidus cross-sectional area was measured with ultrasound in 21 pigs from L1 to S1 before and 3 or 6 days after one of three lesions: incision into the L3-L4 disc, transection of the medial branch of the L3 dorsal ramus, or a sham procedure. Samples from L3 to L5 were studied histologically and chemically. Results. The multifidus cross-sectional area was reduced at L4 ipsilateral to the disc lesion but at L4-L6 after the nerve lesion. There was no change after the sham procedure or on the opposite side. Water and lactate were reduced bilaterally after the disc lesion and ipsilateral to the nerve lesion. Histology revealed enlargement of adipocytes and clustering of myofibers at multiple levels after the disc and nerve lesions. Conclusions. These data resolve the controversy by showing that the multifidus cross-sectional area is reduced rapidly after lumbar injury. Changes after the disc lesion affect one level, with a distribution different from that seen after denervation. Such changes may be due to disuse following reflex inhibitory mechanisms.

Relevance:

100.00%

Publisher:

Abstract:

Foreign exchange trading has emerged in recent times as a significant activity in many countries. As with most forms of trading, the activity is influenced by many random parameters so that the creation of a system that effectively emulates the trading process is very helpful. In this paper, we try to create such a system with a genetic algorithm engine to emulate trader behaviour on the foreign exchange market and to find the most profitable trading strategy.
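As an illustration of the general approach (not the authors' engine), the sketch below evolves a simple moving-average crossover strategy with a genetic algorithm: each individual is a pair of look-back windows, fitness is the profit obtained by replaying the strategy over a price series, and the population is improved through selection, crossover and mutation. All parameter choices are assumptions.

```python
import random

def moving_average(prices, window, t):
    """Mean of the `window` prices ending at index t (inclusive)."""
    return sum(prices[t - window + 1 : t + 1]) / window

def profit(prices, fast, slow):
    """Replay a long-only crossover strategy: hold while fast MA > slow MA."""
    if fast >= slow:
        return float("-inf")  # invalid individual
    pnl, holding, entry = 0.0, False, 0.0
    for t in range(slow, len(prices)):
        signal = moving_average(prices, fast, t) > moving_average(prices, slow, t)
        if signal and not holding:
            holding, entry = True, prices[t]
        elif not signal and holding:
            holding, pnl = False, pnl + prices[t] - entry
    if holding:
        pnl += prices[-1] - entry
    return pnl

def evolve(prices, pop_size=30, generations=40, mutation_rate=0.3):
    """Genetic algorithm over (fast, slow) moving-average windows."""
    rng = random.Random(0)
    pop = [(rng.randint(2, 20), rng.randint(21, 120)) for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=lambda ind: profit(prices, *ind), reverse=True)
        next_pop = scored[:2]                         # elitism
        while len(next_pop) < pop_size:
            a, b = rng.sample(scored[:10], 2)         # breed among the fittest
            child = (a[0], b[1])                      # one-point crossover
            if rng.random() < mutation_rate:          # mutate both genes slightly
                child = (max(2, child[0] + rng.randint(-3, 3)),
                         max(21, child[1] + rng.randint(-10, 10)))
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=lambda ind: profit(prices, *ind))

if __name__ == "__main__":
    # Synthetic price series standing in for historical exchange rates.
    rng = random.Random(42)
    prices = [1.0]
    for _ in range(1000):
        prices.append(prices[-1] * (1 + rng.gauss(0.0002, 0.005)))
    best = evolve(prices)
    print("best (fast, slow) windows:", best, "profit:", round(profit(prices, *best), 4))
```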

Relevance:

100.00%

Publisher:

Abstract:

Case linkage, the linking of crimes into series, is used in policing in the UK and other countries. Previous researchers have proposed using rapists' speech in this practice; however, researching this application requires the development of a reliable coding system for rapists' speech. A system was developed based on linguistic theories of pragmatics, which allowed an utterance to be categorized into a speech act type (e.g. directive). Following this classification, the qualitative properties of each utterance (e.g. the degree of threat it carried) could be captured through the use of rating scales. This system was tested against a previously developed system using 188 rapists' utterances taken from victims' descriptions of rape. The pragmatics-based system demonstrated higher inter-rater reliability whilst enabling the classification of a greater number of rapists' utterances. Inter-rater reliability for the subscales was also tested using a sub-sample of 50 rapists' utterances, and inter-item correlations were calculated. Seventy-six per cent of the subscales had satisfactory to high inter-rater reliability. Based on these findings and the inter-item correlations, the classification system was revised. The potential use of this system for the practices of case linkage and offender profiling is discussed.
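Inter-rater reliability of a categorical coding scheme of this kind is commonly summarized with Cohen's kappa, and the rating scales with inter-item correlations. The sketch below illustrates both computations on made-up ratings; it is not the analysis reported in the study.

```python
# Sketch: Cohen's kappa for two raters' speech-act categories and a Pearson
# correlation between two rating-scale items (illustrative data only).
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters on the same items, corrected for chance."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

def pearson_r(x, y):
    """Inter-item correlation between two numeric rating scales."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

rater_a = ["directive", "directive", "question", "statement", "directive", "question"]
rater_b = ["directive", "statement", "question", "statement", "directive", "question"]
print(round(cohens_kappa(rater_a, rater_b), 2))          # 0.75, agreement beyond chance

threat_item = [1, 4, 2, 5, 3, 4]
hostility_item = [2, 5, 2, 4, 3, 5]
print(round(pearson_r(threat_item, hostility_item), 2))  # ~0.84, high positive correlation
```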

Relevance:

100.00%

Publisher:

Abstract:

As the Intensive Care Unit (ICU) is one of the vital areas of a hospital providing clinical care, the quality of the service it renders must be monitored and measured quantitatively. It is therefore essential to know the performance of an ICU in order to identify any deficits and enable the service providers to improve the quality of service. Although there have been many attempts to do this with the help of illness severity scoring systems, the relative lack of success of these methods has led to the search for a form of measurement that would encompass all the different aspects of an ICU in a holistic manner. The Analytic Hierarchy Process (AHP), a multiple-attribute decision-making technique, is used in this study to evolve a system for measuring the performance of ICU services reliably. The tool has been applied to a surgical ICU in Barbados; we recommend AHP as a valuable tool to quantify the performance of an ICU. Copyright © 2004 Inderscience Enterprises Ltd.
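For readers unfamiliar with AHP, the core computation is straightforward: attributes are compared pairwise on a ratio scale, priority weights are extracted from the comparison matrix, and a consistency ratio checks whether the judgements are coherent. The sketch below uses the principal-eigenvector method with illustrative criteria and judgements that are not taken from the study.

```python
# Sketch: AHP priority weights and consistency ratio for illustrative ICU
# performance criteria (the criteria and judgements below are assumptions).
import numpy as np

# Pairwise comparison matrix on Saaty's 1-9 scale for three hypothetical
# criteria: clinical outcome, resource utilisation, patient/family satisfaction.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Principal eigenvector gives the priority weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency index and ratio (random index RI = 0.58 for a 3x3 matrix).
n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)
cr = ci / 0.58

print("weights:", np.round(weights, 3))    # approximately [0.65, 0.23, 0.12]
print("consistency ratio:", round(cr, 3))  # well below the usual 0.10 threshold
```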

Relevance:

100.00%

Publisher:

Abstract:

Task classification is introduced as a method for the evaluation of monitoring behaviour in different task situations. On the basis of an analysis of different monitoring tasks, a task classification system comprising four task 'dimensions' is proposed. The perceptual speed and flexibility of closure categories, which are identified with signal discrimination type, comprise the principal dimension in this taxonomy, the others being sense modality, the time course of events, and source complexity. It is also proposed that decision theory provides the most complete method for the analysis of performance in monitoring tasks. Several different aspects of decision theory in relation to monitoring behaviour are described. A method is also outlined whereby both accuracy and latency measures of performance may be analysed within the same decision theory framework. Eight experiments and an organizational study are reported. The results show that a distinction can be made between the perceptual efficiency (sensitivity) of a monitor and his criterial level of response, and that in most monitoring situations, there is no decrement in efficiency over the work period, but an increase in the strictness of the response criterion. The range of tasks exhibiting either or both of these performance trends can be specified within the task classification system. In particular, it is shown that a sensitivity decrement is only obtained for 'speed' tasks with a high stimulation rate. A distinctive feature of 'speed' tasks is that target detection requires the discrimination of a change in a stimulus relative to preceding stimuli, whereas in 'closure' tasks, the information required for the discrimination of targets is presented at the same point in time. In the final study, the specification of tasks yielding sensitivity decrements is shown to be consistent with a task classification analysis of the monitoring literature. It is also demonstrated that the signal type dimension has a major influence on the consistency of individual differences in performance in different tasks. The results provide an empirical validation for the 'speed' and 'closure' categories, and suggest that individual differences are not completely task specific but are dependent on the demands common to different tasks. Task classification is therefore shown to enable improved generalizations to be made of the factors affecting 1) performance trends over time, and 2) the consistency of performance in different tasks. A decision theory analysis of response latencies is shown to support the view that criterion shifts are obtained in some tasks, while sensitivity shifts are obtained in others. The results of a psychophysiological study also suggest that evoked potential latency measures may provide temporal correlates of criterion shifts in monitoring tasks. Among other results, the finding that the latencies of negative responses do not increase over time is taken to invalidate arousal-based theories of performance trends over a work period. An interpretation in terms of expectancy, however, provides a more reliable explanation of criterion shifts. Although the mechanisms underlying the sensitivity decrement are not completely clear, the results rule out 'unitary' theories such as observing response and coupling theory. It is suggested that an interpretation in terms of the memory data limitations on information processing provides the most parsimonious explanation of all the results in the literature relating to sensitivity decrement.
Task classification therefore enables the refinement and selection of theories of monitoring behaviour in terms of their reliability in generalizing predictions to a wide range of tasks. It is thus concluded that task classification and decision theory provide a reliable basis for the assessment and analysis of monitoring behaviour in different task situations.
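The decision-theory quantities referred to throughout (sensitivity and response criterion) are the standard signal detection measures, and separating them from raw detection rates is what allows the claim that efficiency is stable over the work period while the criterion becomes stricter. A minimal sketch of the computation, using conventional formulas rather than anything specific to this thesis:

```python
# Sketch: signal detection measures for a monitoring (vigilance) task.
# d' separates perceptual sensitivity from the response criterion c, so a
# stricter criterion (fewer hits AND fewer false alarms) can be distinguished
# from a genuine loss of sensitivity.
from statistics import NormalDist

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    z = NormalDist().inv_cdf
    # Log-linear correction avoids infinite z-scores at rates of 0 or 1.
    hit_rate = (hits + 0.5) / (hits + misses + 1)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    d_prime = z(hit_rate) - z(fa_rate)           # sensitivity
    criterion = -(z(hit_rate) + z(fa_rate)) / 2  # response criterion c
    return d_prime, criterion

# Hypothetical watch: early period vs. late period of a monitoring session.
early = sdt_measures(hits=40, misses=10, false_alarms=12, correct_rejections=138)
late = sdt_measures(hits=30, misses=20, false_alarms=4, correct_rejections=146)
print("early d'=%.2f c=%.2f" % early)  # similar d' in both periods...
print("late  d'=%.2f c=%.2f" % late)   # ...but a stricter (higher) criterion late
```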