105 results for "Minimal rationality"

Relevance: 10.00%

Publisher:

Abstract:

Continuing developments in science and technology mean that the amount of information forensic scientists are able to provide for criminal investigations is ever increasing. The commensurate increase in complexity creates difficulties for scientists and lawyers with regard to evaluation and interpretation, notably with respect to issues of inference and decision. Probability theory, implemented through graphical methods, and specifically Bayesian networks, provides powerful methods to deal with this complexity. Extensions of these methods to elements of decision theory provide further support and assistance to the judicial system. Bayesian Networks for Probabilistic Inference and Decision Analysis in Forensic Science provides a unique and comprehensive introduction to the use of Bayesian decision networks for the evaluation and interpretation of scientific findings in forensic science, and for the support of decision-makers in their scientific and legal tasks. The book:
- Includes self-contained introductions to probability and decision theory.
- Develops the characteristics of Bayesian networks, object-oriented Bayesian networks and their extension to decision models.
- Features implementation of the methodology with reference to commercial and academically available software.
- Presents standard networks and their extensions that can be easily implemented and that can assist in the reader's own analysis of real cases.
- Provides a technique for structuring problems and organizing data based on methods and principles of scientific reasoning.
- Contains a method for the construction of coherent and defensible arguments for the analysis and evaluation of scientific findings and for decisions based on them.
- Is written in a lucid style, suitable for forensic scientists and lawyers with minimal mathematical background.
- Includes a foreword by Ian Evett.
The clear and accessible style of this second edition makes this book ideal for all forensic scientists, applied statisticians and graduate students wishing to evaluate forensic findings from the perspective of probability and decision analysis. It will also appeal to lawyers and other scientists and professionals interested in the evaluation and interpretation of forensic findings, including decision making based on scientific information.
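The core calculation such networks automate can, in the simplest two-hypothesis case, be reduced to Bayes' theorem in odds form: posterior odds = likelihood ratio × prior odds. A minimal sketch (the function name and the example numbers are illustrative, not taken from the book):

```python
# Two-hypothesis Bayesian update as used in forensic evidence evaluation:
# posterior odds = likelihood ratio (LR) x prior odds.
def posterior_probability(prior_h: float, p_e_given_h: float,
                          p_e_given_not_h: float) -> float:
    """Posterior probability of hypothesis H after observing evidence E."""
    lr = p_e_given_h / p_e_given_not_h        # likelihood ratio of the evidence
    prior_odds = prior_h / (1.0 - prior_h)
    post_odds = lr * prior_odds
    return post_odds / (1.0 + post_odds)

# Illustrative numbers: evidence certain under H, 1% frequent under not-H,
# prior P(H) = 1/1000.
p = posterior_probability(0.001, 1.0, 0.01)
```

Even a likelihood ratio of 100 leaves the posterior around 9% here, which is the kind of reasoning error (neglecting the prior) that explicit network models guard against.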


The investigation of perceptual and cognitive functions with non-invasive brain imaging methods critically depends on the careful selection of stimuli for use in experiments. For example, it must be verified that any observed effects follow from the parameter of interest (e.g. semantic category) rather than from other low-level physical features (e.g. luminance or spectral properties); otherwise, interpretation of the results is confounded. Often, researchers circumvent this issue by including additional control conditions or tasks, both of which are flawed and also prolong experiments. Here, we present some new approaches for controlling classes of stimuli intended for use in cognitive neuroscience; these methods can, however, be readily extrapolated to other applications and stimulus modalities. Our approach comprises two levels. The first level aims at equalizing individual stimuli in terms of their mean luminance: each data point in the stimulus is adjusted toward a standard value computed across the stimulus battery. The second level analyzes two populations of stimuli along their spectral properties (i.e. spatial frequency) using a dissimilarity metric that equals the root mean square of the distance between the two populations of objects as a function of spatial frequency along the x- and y-dimensions of the image. Randomized permutations are used to obtain a minimal value between the populations, minimizing, in a completely data-driven manner, the spectral differences between image sets. While another paper in this issue applies these methods to acoustic stimuli (Aeschlimann et al., Brain Topogr 2008), we illustrate the approach here in detail for complex visual stimuli.
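The two levels described above can be sketched as follows. This is an illustrative reconstruction, not the authors' published code: the function names are assumptions, and a plain 2-D FFT amplitude spectrum stands in for their spatial-frequency analysis.

```python
import numpy as np

def equalize_mean_luminance(images, target=None):
    """Level 1: shift each image so its mean luminance equals a
    battery-wide standard value (the mean of all image means)."""
    imgs = [np.asarray(im, dtype=float) for im in images]
    if target is None:
        target = np.mean([im.mean() for im in imgs])
    return [im + (target - im.mean()) for im in imgs]

def spectral_rms_distance(set_a, set_b):
    """Level 2: RMS distance between the mean 2-D amplitude spectra
    of two image populations (a simplified dissimilarity metric)."""
    spec = lambda ims: np.mean([np.abs(np.fft.fft2(im)) for im in ims], axis=0)
    return np.sqrt(np.mean((spec(set_a) - spec(set_b)) ** 2))
```

In the full method, randomized permutations of set membership would be used to search for the assignment minimizing this distance.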


Introduction: The general strategy to perform anti-doping analysis starts with a screening followed by a confirmatory step when a sample is suspected to be positive. The screening step should be fast, generic and able to highlight any sample that may contain a prohibited substance, avoiding false negative and reducing false positive results. The confirmatory step is a dedicated procedure comprising a selective sample preparation and detection mode. Aim: The purpose of the study is to develop rapid screening and selective confirmatory strategies to detect and identify 103 doping agents in urine. Methods: For the screening, urine samples were simply diluted by a factor of 2 with ultra-pure water and directly injected ("dilute and shoot") into the ultra-high-pressure liquid chromatography (UHPLC) system. The UHPLC separation was performed in two gradients (ESI positive and negative) from 5/95 to 95/5% MeCN/water containing 0.1% formic acid. The gradient analysis time is 9 min, including 3 min of re-equilibration. Analyte detection was performed in full-scan mode on a quadrupole time-of-flight (QTOF) mass spectrometer by acquiring the exact mass of the protonated (ESI positive) or deprotonated (ESI negative) molecular ion. For the confirmatory analysis, urine samples were extracted on an SPE 96-well plate with mixed-mode cation-exchange (MCX) sorbents for basic and neutral compounds or anion-exchange (MAX) sorbents for acidic molecules. The analytes were eluted in 3 min (including 1.5 min of re-equilibration) with a gradient from 5/95 to 95/5% MeCN/water containing 0.1% formic acid. Analyte confirmation was performed in MS and MS/MS mode on a QTOF mass spectrometer. Results: In the screening and confirmatory analysis, basic and neutral analytes were analysed in positive ESI mode, whereas acidic compounds were analysed in negative mode. Analyte identification was based on retention time (tR) and exact mass measurement.
"Dilute and shoot" was used as a generic sample treatment in the screening procedure, but matrix effects (e.g., ion suppression) cannot be avoided. However, the sensitivity was sufficient for all analytes to reach the minimum required performance limit (MRPL) set by the World Anti-Doping Agency (WADA). To avoid time-consuming confirmatory analysis of false positive samples, a pre-confirmatory step was added. It consists of re-injecting the sample, acquiring MS/MS spectra and comparing them to reference material. For the confirmatory analysis, urine samples were extracted by SPE, allowing pre-concentration of the analytes. A fast chromatographic separation was developed, as only a single analyte has to be confirmed. A dedicated QTOF-MS and MS/MS acquisition was performed to acquire, within the same run, a parallel scanning of two functions. Low collision energy was applied in the first channel to obtain the protonated molecular ion (QTOF-MS), while dedicated collision energy was set in the second channel to obtain fragment ions (QTOF-MS/MS). Enough identification points were obtained to compare the spectra with reference material and a negative urine sample. Finally, the entire process was validated and matrix effects were quantified. Conclusion: Thanks to the coupling of UHPLC with the QTOF mass spectrometer, high tR repeatability, sensitivity, mass accuracy and mass resolution over a broad mass range were obtained. The method was sensitive, robust and reliable enough to detect and identify doping agents in urine. Keywords: screening, confirmatory analysis, UHPLC, QTOF, doping agents
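The identification criterion described in the Results (retention time plus exact mass) reduces to a tolerance check per candidate. A sketch; the tolerance values and the reference m/z below are assumptions for illustration, not the validated method's settings:

```python
# Flag a screening candidate when measured retention time and exact mass
# agree with a reference within tolerances (ppm for mass, minutes for tR).
# Tolerances here are illustrative, not the method's validated values.
def matches(meas_mz, meas_tr, ref_mz, ref_tr,
            ppm_tol=5.0, tr_tol=0.1):
    ppm_err = abs(meas_mz - ref_mz) / ref_mz * 1e6   # mass error in ppm
    return ppm_err <= ppm_tol and abs(meas_tr - ref_tr) <= tr_tol

# Example: protonated testosterone [M+H]+ at m/z 289.2162 (C19H28O2 + H)
hit = matches(289.2168, 6.42, 289.2162, 6.40)   # ~2 ppm, 0.02 min off
```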


The Tiwi people of northern Australia have managed natural resources continuously for 6000-8000 years. Tiwi management objectives and outcomes may reflect how they gather information about the environment. We qualitatively analyzed Tiwi documents and management techniques to examine the relation between the social and physical environment of decision makers and their decision-making strategies. We hypothesized that principles of bounded rationality, namely, the use of efficient rules to navigate complex decision problems, explain how Tiwi managers use simple decision strategies (i.e., heuristics) to make robust decisions. Tiwi natural resource managers reduced complexity in decision making through a process that gathers incomplete and uncertain information to quickly guide decisions toward effective outcomes. They used management feedback to validate decisions through an information loop that resulted in long-term sustainability of environmental use. We examined the Tiwi decision-making processes relative to management of barramundi (Lates calcarifer) fisheries and contrasted their management with the state government's management of barramundi. Decisions that enhanced the status of individual people and their attainment of aspiration levels resulted in reliable resource availability for Tiwi consumers. Different decision processes adopted by the state for management of barramundi may not secure similarly sustainable outcomes.
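The aspiration-level decision making described above is an instance of satisficing, a classic bounded-rationality heuristic: accept the first option that meets an aspiration level rather than exhaustively optimizing. A minimal sketch, not the Tiwi decision model itself (the fishing-site data are invented for illustration):

```python
# Satisficing heuristic: return the first option whose value meets the
# decision maker's aspiration level; search stops there, saving effort.
def satisfice(options, aspiration, value):
    for opt in options:
        if value(opt) >= aspiration:
            return opt
    return None  # no option met the aspiration level

# Hypothetical catch estimates (kg) per fishing site, checked in order.
catch_sites = [("site_a", 12), ("site_b", 35), ("site_c", 60)]
choice = satisfice(catch_sites, aspiration=30, value=lambda s: s[1])
```

Note that site_c is never examined: the heuristic trades optimality for speed and robustness under incomplete information, which is the pattern the study attributes to the Tiwi managers.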


INTRODUCTION: Intravoxel incoherent motion (IVIM) imaging is an MRI perfusion technique that uses a diffusion-weighted sequence with multiple b values and a bi-compartmental signal model to measure the so-called pseudo-diffusion of blood caused by its passage through the microvascular network. The goal of the current study was to assess the feasibility of IVIM perfusion fraction imaging in patients with acute stroke. METHODS: Images were collected in 17 patients with acute stroke. Exclusion criteria were onset of symptoms to imaging >5 days, hemorrhagic transformation, infratentorial lesions, small lesions <0.5 cm in minimal diameter and hemodynamic instability. IVIM imaging was performed at 3 T, using a standard spin-echo Stejskal-Tanner pulsed-gradient diffusion-weighted sequence with 16 b values from 0 to 900 s/mm². Image quality was assessed by two radiologists, and quantitative analysis was performed in regions of interest placed in the stroke area, defined by thresholding the apparent diffusion coefficient maps, as well as in the contralateral region. RESULTS: IVIM perfusion fraction maps showed an area of decreased perfusion fraction f in the region of decreased apparent diffusion coefficient. Quantitative analysis showed a statistically significant decrease in both IVIM perfusion fraction f (0.026 ± 0.019 vs. 0.056 ± 0.025, p = 2.2 × 10⁻⁶) and diffusion coefficient D compared with the contralateral side (3.9 ± 0.79 × 10⁻⁴ vs. 7.5 ± 0.86 × 10⁻⁴ mm²/s, p = 1.3 × 10⁻²⁰). CONCLUSION: IVIM perfusion fraction imaging is feasible in acute stroke. IVIM perfusion fraction is significantly reduced in the visible infarct. Further studies should evaluate the potential for IVIM to predict clinical outcome and treatment response.
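The bi-compartmental IVIM model, S(b)/S0 = f·exp(−b·D*) + (1 − f)·exp(−b·D), is fitted voxel-wise to the multi-b-value signal. A fitting sketch on noiseless synthetic data; parameter values are chosen to resemble the contralateral-side results above, and the fitting details (initial guess, bounds) are assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

# Bi-exponential IVIM signal model: perfusion fraction f, pseudo-diffusion
# coefficient D* (fast compartment), tissue diffusion coefficient D.
def ivim(b, f, d_star, d):
    return f * np.exp(-b * d_star) + (1 - f) * np.exp(-b * d)

b_vals = np.linspace(0, 900, 16)             # 16 b values, 0-900 s/mm^2
signal = ivim(b_vals, 0.056, 0.01, 7.5e-4)   # synthetic healthy-tissue curve

# Recover the parameters from the synthetic curve.
(f_fit, ds_fit, d_fit), _ = curve_fit(
    ivim, b_vals, signal, p0=[0.1, 0.01, 1e-3],
    bounds=([0.0, 1e-3, 1e-5], [1.0, 1.0, 1e-2]))
```

On real data the fit is noise-sensitive, which is why segmented or constrained fitting strategies are often preferred in practice.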


Image quality in magnetic resonance imaging (MRI) is considerably affected by motion. Therefore, motion is one of the most common sources of artifacts in contemporary cardiovascular MRI. Such artifacts in turn may easily lead to misinterpretations in the images and a subsequent loss in diagnostic quality. Hence, there is considerable research interest in strategies that help to overcome these limitations at minimal cost in time, spatial resolution, temporal resolution, and signal-to-noise ratio. This review summarizes and discusses the three principal sources of motion: the beating heart, the breathing lungs, and bulk patient movement. This is followed by a comprehensive overview of commonly used compensation strategies for these different types of motion. Finally, a summary and an outlook are provided.
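As one concrete example of the compensation strategies reviewed, retrospective cardiac gating sorts acquired data into cardiac-phase bins using detected R-waves. A simplified sketch; the function name and bin count are illustrative, not a specific scanner implementation:

```python
import bisect

def cardiac_phase(t, r_waves, n_bins=20):
    """Map an acquisition timestamp t (seconds) to a cardiac phase bin,
    given sorted R-wave detection times. Returns None outside the
    recorded R-R intervals."""
    i = bisect.bisect_right(r_waves, t) - 1
    if i < 0 or i + 1 >= len(r_waves):
        return None
    rr = r_waves[i + 1] - r_waves[i]        # current R-R interval
    phase = (t - r_waves[i]) / rr           # 0 <= phase < 1
    return min(int(phase * n_bins), n_bins - 1)

r = [0.0, 1.0, 2.0]                         # R-waves at 1 s intervals (60 bpm)
bin_id = cardiac_phase(0.55, r)             # sample at 55% of the cycle
```

Each k-space line acquired during free-running imaging can then be reconstructed with the other lines sharing its phase bin, freezing cardiac motion at the cost of scan time.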


ABSTRACT: BACKGROUND: Several studies have shown that in diabetic patients, the glycemic profile is disturbed after intra-articular injection of corticosteroids. Little is known about the impact of epidural injection in such patients. The goal of this study was twofold: first, to compare the glycemic profile in diabetic patients after a single injection of 80 mg of methylprednisolone acetate, either intra-articular or epidural, and second, to compare the amount of systemic diffusion of the drug after both procedures. METHODS: Seventeen patients were included. Glycemic changes were compared in 9 diabetic patients following intra-articular (4 patients) and epidural injections (5 patients). Epidural injections were performed via the sacral route under fluoroscopic control in patients with lumbar spinal stenosis. Diabetes control had to be stable for more than 10 days and renal function preserved. Blood glucose was monitored using a validated continuous measuring device (CGMS, Medtronic) the day before and for two days following the injection. Results were expressed as daily glycemic profiles and as mean, peak and minimal values ± SD. The urinary excretion of methylprednisolone after the 2 routes of injection was analyzed in 8 patients (4 in each group). Urine samples were collected one hour before the injections, then 4 times during the first day and 3 times a week for 2 weeks. The measurements included the free and conjugated fractions. RESULTS: The glycemic profile remained unchanged, with no significant changes, in the group of 5 diabetic patients receiving epidural injections. On the other hand, the average peak and mean values rose by up to 3 mmol/l above baseline during the two days following the infiltration in the group of 4 diabetic patients injected intra-articularly. The mean urinary excretion of the steroid was about ten times higher in the intra-articular than in the epidural group: 7000 ng/ml versus 700 ng/ml.
Looking at each individual, there were marked differences, especially after intra-articular injections. CONCLUSION: This is the first study to show that a single epidural steroid injection of 80 mg depot methylprednisolone has no effect on glycemic control in diabetic patients. The absence of glycemic changes correlated well with the very low urinary excretion of the drug after epidural injection. Trial registration: NCT01420497.


OBJECTIVES: The thermogenic effect of amrinone is unknown and its utilization in patients with severe cardiac failure could potentially increase oxygen requirements and therefore aggravate oxygen debt. Consequently, the present study was undertaken to assess the thermogenic response to amrinone at three different plasma concentrations under controlled conditions and to analyze amrinone's effects on various biochemical variables. DESIGN: A prospective, unblinded, controlled study. The initial control period was followed by three sequential, experimental treatments. SUBJECTS: Ten young, healthy, male volunteers with normal body weight. INTERVENTIONS: Three experimental periods. Amrinone was administered intravenously in progressive doses: a) 0.5 mg/kg followed by 5 micrograms/kg/min; b) 0.5 mg/kg followed by 10 micrograms/kg/min; and c) 1.0 mg/kg followed by 10 micrograms/kg/min. MEASUREMENTS AND MAIN RESULTS: Oxygen consumption (VO2) and CO2 production were continuously measured by means of a computerized indirect calorimeter. At the highest dose, amrinone produced a slight and significant (p < .01) increase in VO2 and in resting metabolic rate (+4.5% and +3.7%, respectively), while no change in CO2 production or in respiratory quotient occurred throughout the study. At the medium and high doses, amrinone increased plasma free fatty acid concentrations by 38% and 53%, respectively (p < .05). No variation in plasma glucose, lactate, insulin, norepinephrine, or epinephrine concentrations was observed during the study. CONCLUSIONS: Amrinone administered intravenously at therapeutic doses has minimal thermogenic and metabolic effects in humans without cardiac failure.
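Indirect calorimetry converts the measured VO2 and VCO2 into an energy expenditure (resting metabolic rate). The abbreviated Weir equation is the standard conversion; the coefficients below are the classical Weir constants, not values reported in this study:

```python
# Abbreviated Weir equation: energy expenditure from respiratory gases.
# Inputs in L/min (STPD); output in kcal/day. Coefficients are the
# standard Weir constants (3.941 kcal/L O2, 1.106 kcal/L CO2).
def weir_kcal_per_day(vo2_l_min, vco2_l_min):
    kcal_per_min = 3.941 * vo2_l_min + 1.106 * vco2_l_min
    return kcal_per_min * 1440          # minutes per day

ree = weir_kcal_per_day(0.25, 0.20)     # typical resting adult values
```

With these inputs (respiratory quotient 0.8), the estimate is about 1737 kcal/day; the +3.7% resting metabolic rate change reported above would correspond to a shift of roughly 60 kcal/day at such a baseline.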


Although glycogen (Glyc) is the main carbohydrate storage component, the role of Glyc in the brain during prolonged wakefulness is not clear. The aim of this study was to determine brain Glyc concentration ([Glyc]) and turnover time (τ) in euglycemic conscious and undisturbed rats, compared to rats maintained awake for 5 h. To measure the metabolism of [1-¹³C]-labeled Glc into Glyc, 23 rats received a [1-¹³C]-labeled Glc solution as drink (10% weight per volume in tap water) ad libitum as their sole source of exogenous carbon for a "labeling period" of either 5 h (n=13), 24 h (n=5) or 48 h (n=5). Six of the rats labeled for 5 h were continuously maintained awake by acoustic, tactile and olfactory stimuli during the labeling period, which resulted in slightly elevated corticosterone levels. Brain [Glyc] measured biochemically after focused microwave fixation in the rats maintained awake (3.9 ± 0.2 μmol/g, n=6) was not significantly different from that of the control group (4.0 ± 0.1 μmol/g, n=7; t-test, P>0.5). To account for potential variations in plasma Glc isotopic enrichment (IE), Glyc IE was normalized by N-acetyl-aspartate (NAA) IE. A simple mathematical model was developed to derive a brain Glyc turnover time of 5.3 h (fit error 3.2 h) and an NAA turnover time of 15.6 h (fit error 6.5 h) in the control rats. A faster τ(Glyc) (2.9 h, fit error 1.2 h) was estimated in the rats maintained awake for 5 h. In conclusion, 5 h of prolonged wakefulness mainly activates glycogen metabolism, but has minimal effect on brain [Glyc].
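The turnover-time estimate can be illustrated with a simple precursor-product labeling model, IE(t) = 1 − exp(−t/τ) for normalized enrichment. Inverting it for τ is a sketch of the idea only, not the paper's full model (which also normalizes by NAA enrichment):

```python
import math

# Invert the single-pool labeling curve IE(t) = 1 - exp(-t/tau) to get the
# turnover time tau from one enrichment measurement. The 61% figure below
# is chosen for illustration so that tau comes out near the reported 5.3 h.
def tau_from_enrichment(t_hours, normalized_ie):
    return -t_hours / math.log(1.0 - normalized_ie)

tau = tau_from_enrichment(5.0, 0.61)   # ~61% of steady-state label at 5 h
```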


OBJECTIVE: Evaluate whether healthy or diabetic adult mice can tolerate an extreme loss of pancreatic α-cells and how this sudden massive depletion affects β-cell function and blood glucose homeostasis. RESEARCH DESIGN AND METHODS: We generated a new transgenic model allowing near-total α-cell removal specifically in adult mice. Massive α-cell ablation was triggered in normally grown and healthy adult animals upon diphtheria toxin (DT) administration. The metabolic status of these mice was assessed in 1) physiologic conditions, 2) a situation requiring glucagon action, and 3) after β-cell loss. RESULTS: Adult transgenic mice enduring extreme (98%) α-cell removal remained healthy and did not display major defects in insulin counter-regulatory response. We observed that 2% of the normal α-cell mass produced enough glucagon to ensure near-normal glucagonemia. β-Cell function and blood glucose homeostasis remained unaltered after α-cell loss, indicating that direct local intraislet signaling between α- and β-cells is dispensable. Escaping α-cells increased their glucagon content during subsequent months, but there was no significant α-cell regeneration. Near-total α-cell ablation did not prevent hyperglycemia in mice having also undergone massive β-cell loss, indicating that a minimal amount of α-cells can still guarantee normal glucagon signaling in diabetic conditions. CONCLUSIONS: An extremely low amount of α-cells is sufficient to prevent a major counter-regulatory deregulation, both under physiologic and diabetic conditions. We previously reported that α-cells reprogram to insulin production after extreme β-cell loss and now conjecture that the low α-cell requirement could be exploited in future diabetic therapies aimed at regenerating β-cells by reprogramming adult α-cells.


A method of objectively determining imaging performance for a mammography quality assurance programme for digital systems was developed. The method is based on assessing the visibility of a spherical microcalcification of 0.2 mm using a quasi-ideal observer model. It requires measurement of the spatial resolution (modulation transfer function) and the noise power spectra of the systems. The contrast is measured using a 0.2-mm thick Al sheet and polymethylmethacrylate (PMMA) blocks. The minimal image quality was defined as that giving a target contrast-to-noise ratio (CNR) of 5.4. Several evaluations of this objective method for assessing image quality in mammography quality assurance programmes have been performed on computed radiography (CR) and digital radiography (DR) mammography systems. The measurement gives the threshold CNR necessary to reach the minimum standard of image quality required with regard to the visibility of a 0.2-mm microcalcification. This method may replace the CDMAM image evaluation and simplify the threshold contrast visibility test used in mammography quality assurance.
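In its simplest form the threshold test reduces to computing a CNR from two regions of interest and comparing it with the 5.4 target. A simplified sketch using synthetic ROI data; the full method additionally weights contrast by the system's MTF and NPS:

```python
import numpy as np

# CNR from a signal ROI (e.g. behind the 0.2-mm Al sheet) and a
# background ROI, using the background standard deviation as the noise
# estimate. A simplification of the quasi-ideal observer calculation.
def cnr(signal_roi, background_roi):
    s, b = np.mean(signal_roi), np.mean(background_roi)
    sigma = np.std(background_roi)
    return abs(s - b) / sigma

# Synthetic pixel data: background at 100, Al region at 112, noise sd 2.
rng = np.random.default_rng(0)
bg = rng.normal(100.0, 2.0, 10_000)
sig = rng.normal(112.0, 2.0, 10_000)
passes = cnr(sig, bg) >= 5.4            # target CNR from the method
```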


Indirect calorimetry based on respiratory exchange measurement has been successfully used since the beginning of the century to obtain an estimate of heat production (energy expenditure) in human subjects and animals. The errors inherent to this classical technique can stem from various sources: 1) the model of calculation and its assumptions, 2) the calorimetric factors used, 3) technical factors and 4) human factors. The physiological and biochemical factors influencing the interpretation of calorimetric data include a change in the size of the bicarbonate and urea pools and the accumulation or loss (via breath, urine or sweat) of intermediary metabolites (gluconeogenesis, ketogenesis). More recently, respiratory gas exchange data have been used to estimate substrate utilization rates in various physiological and metabolic situations (fasting, post-prandial state, etc.). It should be recalled that indirect calorimetry provides an index of overall substrate disappearance rates, which is incorrectly assumed to be equivalent to substrate "oxidation" rates. Unfortunately, there is no adequate gold standard to validate whole-body substrate "oxidation" rates; this contrasts with the "validation" of heat production by indirect calorimetry through direct calorimetry under strict thermal equilibrium conditions. Tracer techniques using stable (or radioactive) isotopes represent an independent way of assessing substrate utilization rates. When carbohydrate metabolism is measured with both techniques, indirect calorimetry generally provides glucose "oxidation" rates consistent with isotopic tracers, but only when certain metabolic processes (such as gluconeogenesis and lipogenesis) are minimal and/or when the respiratory quotients are not at the extremes of the physiological range.
However, it is believed that the tracer techniques underestimate true glucose "oxidation" rates because they fail to account for glycogenolysis in the tissues storing glucose, since this glucose escapes the systemic circulation. A major advantage of isotopic techniques is that they are able to estimate (given certain assumptions) various metabolic processes (such as gluconeogenesis) in a noninvasive way. Furthermore, when a fourth substrate (such as ethanol) is administered in addition to the 3 macronutrients, isotopic quantification of substrate "oxidation" allows one to eliminate the inherent assumptions made by indirect calorimetry. In conclusion, isotopic tracer techniques and indirect calorimetry should be considered complementary techniques, in particular since the tracer techniques require the measurement of carbon dioxide production obtained by indirect calorimetry. However, it should be kept in mind that the assessment of substrate oxidation by indirect calorimetry may involve large errors, particularly over short periods of time. By indirect calorimetry, energy expenditure (heat production) is calculated with substantially less error than substrate oxidation rates.
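The substrate-utilization calculation discussed above is classically done with stoichiometric equations of the kind published by Frayn. A sketch assuming the widely cited Frayn coefficients, shown here as an illustration of the calculation rather than this paper's own derivation:

```python
# Net substrate "oxidation" rates from respiratory gas exchange
# (Frayn-type stoichiometric equations). VO2, VCO2 in L/min;
# urinary nitrogen excretion N in g/min; results in g/min.
def substrate_oxidation(vo2, vco2, n_urinary=0.0):
    cho = 4.55 * vco2 - 3.21 * vo2 - 2.87 * n_urinary   # carbohydrate
    fat = 1.67 * vo2 - 1.67 * vco2 - 1.92 * n_urinary   # fat
    return cho, fat

# Resting adult with respiratory quotient 0.8, protein oxidation ignored.
cho, fat = substrate_oxidation(0.25, 0.20)
```

The fat term vanishes as VCO2 approaches VO2 (respiratory quotient 1.0), which is exactly where the text warns that interpretation at the extremes of the physiological range becomes unreliable.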


The aim of the study was to determine the influence of the dissection of the palate during primary surgery on the type of orthognathic surgery needed in cases of unilateral total cleft. The review concerns 58 children born with a complete unilateral cleft lip and palate, treated between 1994 and 2008, who had reached the appropriate age for orthognathic surgery. This is a retrospective mixed-longitudinal study. Patients with syndromes or associated anomalies were excluded. All children were treated by the same orthodontist and the same surgical team. The children were divided into 2 groups: the first group includes children who had conventional primary cleft palate repair during their first year of life, with extensive mucoperiosteal undermining. The second group includes children operated on according to the Malek surgical protocol: the soft palate is closed at the age of 3 months, and the hard palate at 6 months, with minimal mucoperiosteal undermining. Lateral cephalograms at ages 9 and 16 years and surgical records were compared. The need for orthognathic surgery was more frequent in the first than in the second group (60% vs 47.8%). Concerning the type of orthognathic surgery performed, 2- or 3-piece Le Fort I or bimaxillary osteotomies were also less often required in the first group. Palate surgery following the Malek procedure results in an improved and simplified craniofacial outcome. With minimal undermining of the palatal mucosa, we managed to reduce the number of patients who required an orthognathic procedure. When this procedure was indicated, the surgical intervention was also greatly simplified.


The assessment of medical technologies has to answer several questions ranging from safety and effectiveness to complex economic, social, and health policy issues. The type of data needed to carry out such an evaluation depends on the specific questions to be answered, as well as on the stage of development of a technology. Basically, two types of data may be distinguished: (a) general demographic, administrative, or financial data that have not been collected specifically for technology assessment; (b) data collected with respect either to a specific technology or to a disease or medical problem. On the basis of a pilot inquiry in Europe and bibliographic research, the following categories of type (b) databases have been identified: registries, clinical databases, banks of factual and bibliographic knowledge, and expert systems. Examples of each category are discussed briefly. The following aims for further research and practical goals are proposed: criteria for the minimal data set required, improvements to registries and clinical data banks, and development of an international clearinghouse to enhance the diffusion of information on both existing databases and available reports on medical technology assessments.