943 results for Non-parametric Tests


Relevance: 80.00%

Abstract:

Amphisbaena nigricauda Gans, 1966 is a small, poorly known amphisbaenid endemic to the restinga of the states of Espírito Santo and Bahia, Brazil. We analyzed 178 specimens collected in Vitória municipality, state of Espírito Santo, Brazil, to investigate whether this species shows sexual dimorphism in pre-cloacal pores and in morphological characters. Sex was determined by a ventral incision and direct inspection of the gonads. A principal component analysis (PCA) was performed to generate a general body size measurement. A t-test and the non-parametric Mann-Whitney test were used to assess whether the species shows sexual dimorphism in five morphometric and five meristic characters, respectively. Sex could not be determined in 36 specimens because they were mutilated in the posterior portion of their bodies. The diagnosis of the species is redefined based on this sample: the smallest number of body annuli changes from 222 to 192, the number of dorsal and ventral segments in an annulus at mid-body changes to 9-11/13-16 (instead of 10/16), and the autotomic tail annulus lies between annuli 7-10 (instead of 6-9). The number of tail annuli remained within the known range of variation of the species (19-24). None of the 80 females analyzed showed pre-cloacal pores, whereas among the 62 males, 59 specimens displayed four pre-cloacal pores and two displayed five. A single male did not possess pre-cloacal pores but showed irregular scales in its cloacal region. Sex-based differences in the presence or absence of pre-cloacal pores, as well as males with wider heads, have been reported in other Neotropical amphisbaenids. However, a pattern of body size differences between males and females has not been identified so far in the few amphisbaenid species studied in this regard. Further studies on this taxonomic group are still needed to elucidate the existence of general patterns of sexual dimorphism and to identify the selective pressures driving these patterns.
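
The comparison described above pairs a parametric test for continuous (morphometric) characters with the non-parametric Mann-Whitney test for count (meristic) characters. A minimal sketch of that pairing follows, using synthetic data rather than the authors' specimens; all values and variable names are hypothetical.

```python
# Minimal sketch of the two-sample comparisons described above, on synthetic
# data; column names and values are hypothetical, not the authors' dataset.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
males_snout_vent = rng.normal(210.0, 15.0, size=62)     # morphometric (continuous)
females_snout_vent = rng.normal(205.0, 15.0, size=80)
males_body_annuli = rng.integers(192, 223, size=62)     # meristic (counts)
females_body_annuli = rng.integers(192, 223, size=80)

# Parametric comparison for a continuous (morphometric) character
t_stat, t_p = stats.ttest_ind(males_snout_vent, females_snout_vent, equal_var=False)

# Non-parametric Mann-Whitney U test for a count (meristic) character
u_stat, u_p = stats.mannwhitneyu(males_body_annuli, females_body_annuli,
                                 alternative="two-sided")

print(f"t-test: t = {t_stat:.2f}, p = {t_p:.3f}")
print(f"Mann-Whitney U: U = {u_stat:.1f}, p = {u_p:.3f}")
```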

Relevance: 80.00%

Abstract:

Inductive learning aims at finding general rules that hold true in a database. Targeted learning seeks rules for predicting the value of one variable based on the values of others, as in linear or non-parametric regression analysis. Non-targeted learning finds regularities without a specific prediction goal. We model the product of non-targeted learning as rules that state that a certain phenomenon never happens, or that certain conditions necessitate another. For all types of rules, there is a trade-off between a rule's accuracy and its simplicity, so rule selection can be viewed as a choice problem among pairs consisting of a degree of accuracy and a degree of complexity. However, one cannot in general tell what the feasible set in the accuracy-complexity space is. Formally, we show that finding out whether a point belongs to this set is computationally hard. In particular, in the context of linear regression, finding a small set of variables that obtains a certain value of R2 is computationally hard. Computational complexity may explain why a person is not always aware of rules that, if asked, she would find valid. This, in turn, may explain why one can change other people's minds (opinions, beliefs) without providing new information.

Relevance: 80.00%

Abstract:

Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold standard TDM approach but require computational assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. The number of drugs handled by the software varies widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentrations (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user-friendly. Programs vary in complexity and might not fit all healthcare settings. Each software tool must therefore be considered with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast to use for routine activities, including for non-experienced users. Computer-assisted TDM is gaining growing interest and should further improve, especially in terms of information system interfacing, user-friendliness, data storage capability and report generation.

Relevance: 80.00%

Abstract:

Public authorities and road users alike are increasingly concerned by recent trends in road safety outcomes in Barcelona, the European city with the highest number of registered Powered Two-Wheel (PTW) vehicles per inhabitant. In this study we explore the determinants of motorcycle and moped accident severity in a large urban area, drawing on Barcelona’s local police database (2002-2008). We apply non-parametric regression techniques to characterize PTW accidents and parametric methods to investigate the factors influencing their severity. Our results show that PTW accident victims are more vulnerable, suffering greater degrees of accident severity, than other traffic victims. Speed violations and alcohol consumption are associated with the worst health outcomes. Demographic and environment-related risk factors, in addition to helmet use, play an important role in determining accident severity. This study thus furthers our understanding of the most vulnerable vehicle types, and our results have direct implications for local policy makers in their efforts to reduce the severity of PTW accidents in large urban areas.
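
A minimal sketch of one non-parametric regression technique of the kind named above, a Nadaraya-Watson kernel regression of injury severity on rider age, applied to simulated data; the bandwidth, the sample and the age/severity relation are invented for illustration.

```python
# Minimal sketch of a non-parametric (Gaussian-kernel) regression of injury
# severity on rider age; the data and the true relation are simulated.
import numpy as np

rng = np.random.default_rng(6)
age = rng.uniform(16, 75, 2000)
p_severe = 0.05 + 0.002 * (age - 16)            # hypothetical true relation
severe = rng.random(2000) < p_severe            # 1 = severe/fatal, 0 = slight

def kernel_regression(x0, x, y, bandwidth=5.0):
    """Gaussian-kernel estimate of E[y | x = x0]."""
    w = np.exp(-0.5 * ((x - x0) / bandwidth) ** 2)
    return np.sum(w * y) / np.sum(w)

for a in (20, 40, 60):
    print(f"estimated P(severe | age={a}) = {kernel_regression(a, age, severe):.3f}")
```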

Relevance: 80.00%

Abstract:

Traditionally, it is assumed that the population size of cities in a country follows a Pareto distribution. This assumption is typically supported by finding evidence of Zipf's Law. Recent studies question this finding, highlighting that, while the Pareto distribution may fit reasonably well when the data are truncated at the upper tail, i.e. for the largest cities of a country, the log-normal distribution may apply when all cities are considered. Moreover, conclusions may be sensitive to the choice of a particular truncation threshold, an issue overlooked in the literature so far. In this paper, then, we reassess the city size distribution in relation to its sensitivity to the choice of truncation point. In particular, we look at US Census data and apply a recursive-truncation approach to estimate Zipf's Law, together with a non-parametric alternative test, considering each possible truncation point of the distribution of all cities. Results confirm the sensitivity of the estimates to the truncation point. Moreover, repeating the analysis on simulated data confirms the difficulty of distinguishing a Pareto tail from the tail of a log-normal and, in turn, of identifying the city size distribution as a false or a weak Pareto law.
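
A minimal sketch of a rank-size (Zipf) regression repeated over truncation points, in the spirit of the recursive-truncation exercise described above, on simulated city sizes. It uses the common Gabaix-Ibragimov rank-minus-1/2 correction, which is an assumption here rather than necessarily the paper's exact estimator.

```python
# Minimal sketch: estimate the Pareto/Zipf exponent from the top-k cities for
# several truncation points; city sizes are simulated from a log-normal.
import numpy as np

rng = np.random.default_rng(1)
sizes = np.sort(rng.lognormal(mean=10.0, sigma=1.2, size=5000))[::-1]

def zipf_coefficient(sizes_desc, top_k):
    """OLS slope of log(rank - 0.5) on log(size) for the top_k largest cities."""
    s = sizes_desc[:top_k]
    ranks = np.arange(1, top_k + 1)
    slope, intercept = np.polyfit(np.log(s), np.log(ranks - 0.5), 1)
    return -slope  # Zipf's Law corresponds to a coefficient close to 1

for k in (100, 500, 1000, 5000):
    print(f"top {k:>4} cities: estimated exponent = {zipf_coefficient(sizes, k):.2f}")
```

On log-normal data the estimated exponent drifts with the truncation point, which is exactly the sensitivity the abstract highlights.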

Relevance: 80.00%

Abstract:

Our objective is to analyse fraud as an operational risk for the insurance company. We study the effect of a fraud detection policy on the insurer's results account, quantifying the loss risk from the perspective of claims auditing. From the point of view of operational risk, the study aims to analyse the effect of failing to detect fraudulent claims after investigation. We chose Value-at-Risk (VaR) as the risk measure, with a non-parametric estimation of the loss risk involved in the detection or non-detection of fraudulent claims. The most relevant conclusion is that auditing claims reduces loss risk for the insurance company.
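
A minimal sketch of a non-parametric VaR estimate obtained as an empirical quantile of a simulated loss distribution; the log-normal losses and the 99% confidence level are illustrative assumptions, not the paper's data or calibration.

```python
# Minimal sketch of empirical-quantile (non-parametric) Value-at-Risk for
# claim losses; the simulated losses and the 99% level are illustrative only.
import numpy as np

rng = np.random.default_rng(2)
losses = rng.lognormal(mean=8.0, sigma=1.0, size=10_000)  # simulated undetected-fraud losses

def empirical_var(loss_sample, level=0.99):
    """Empirical quantile of the loss distribution; no parametric assumption."""
    return np.quantile(loss_sample, level)

print(f"99% VaR of the simulated loss distribution: {empirical_var(losses):,.0f}")
```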

Relevance: 80.00%

Abstract:

The investigation of unexplained syncope remains a challenging clinical problem. In the present study we sought to evaluate the diagnostic value of a standardized work-up focusing on non-invasive tests in patients with unexplained syncope referred to a syncope clinic, and whether certain combinations of clinical parameters are characteristic of rhythmic and reflex causes of syncope. METHODS AND RESULTS: 317 consecutive patients underwent a standardized work-up including a 12-lead ECG, physical examination, detailed history with screening for syncope-related symptoms using a structured questionnaire, followed by carotid sinus massage (CSM) and head-up tilt test. Invasive testing, including an electrophysiological study and implantation of a loop recorder, was only performed in those with structural heart disease or traumatic syncope. Our work-up identified an etiology in 81% of the patients. Importantly, three quarters of the causes were established non-invasively by combining head-up tilt test, CSM and hyperventilation testing. Invasive tests yielded an additional 7% of diagnoses. Logistic analysis identified age and the number of significant prodromes as the only predictive factors of rhythmic syncope. The same two factors, in addition to the duration of the ECG P-wave, were also predictive of vasovagal and psychogenic syncope. These factors, optimally combined in predictive models, showed a high negative and a modest positive predictive value. CONCLUSION: A standardized work-up focusing on non-invasive tests establishes more than three quarters of syncope causes. Predictive models based on simple clinical parameters may help to distinguish between rhythmic and other causes of syncope.

Relevance: 80.00%

Abstract:

OBJECTIVES: To analyse the prevalence of lifetime recourse to prostitution (LRP) among men in the general population of Switzerland from a trend and cohort perspective. METHODS: Using nine repeated representative cross-sectional surveys from 1987 to 2000, age-specific estimates of LRP were computed. Trends and period effects were analysed as the evolution of cross-sectional population estimates within age groups and overall. Cohort analysis relied on cohorts constructed from the 1989 survey and followed in subsequent waves. Age and cohort effects were modelled using logistic regression and non-parametric monotone regression. RESULTS: Whereas prevalence for the younger groups was, as expected, lower, there was no consistent increasing or decreasing trend over the years and no significant period effect. For the 17-30 year age group, the mean estimate over 1987-2000 was 11.5% (range 8.3 to 12.7%); for the 31-45 year group, the mean was 21.5% (range over 1989-2000: 20.3 to 23.0%). Regarding cohort analysis, the prevalence of LRP was found to increase steeply at the youngest ages before reaching a plateau near the age of 40 years. At the age of 43 years, the prevalence was estimated to be 22.6% (95% CI 21.1% to 24.1%). CONCLUSIONS: The steep increase in the cohort-wise prevalence of LRP at younger ages calls for a concentration of prevention activities on young people. If the plateauing at approximately 40 years of age is not followed by a further increase later in life, which is not known, then consumers of paid sex would be repeat buyers only, a fact that should be taken into account by prevention efforts.
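
A minimal sketch of a non-parametric monotone (isotonic) regression of prevalence on age, the kind of technique named in the methods above; the simulated prevalences are illustrative and not the survey data.

```python
# Minimal sketch of non-parametric monotone regression of prevalence on age,
# using scikit-learn's isotonic regression; all numbers are simulated.
import numpy as np
from sklearn.isotonic import IsotonicRegression

rng = np.random.default_rng(7)
age = np.arange(17, 46)
true_prev = np.clip(0.02 + 0.008 * (age - 17), 0, 0.23)   # hypothetical cohort curve
observed = true_prev + rng.normal(0, 0.015, age.size)      # noisy survey estimates

iso = IsotonicRegression(increasing=True)
fitted = iso.fit_transform(age, observed)                  # monotone, non-parametric fit

for a, f in zip(age[::7], fitted[::7]):
    print(f"age {a}: fitted LRP prevalence = {f:.3f}")
```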

Relevance: 80.00%

Abstract:

The involvement of the cerebellum in migraine pathophysiology is not well understood. We used a biparametric approach at high-field MRI (3 T) to assess the structural integrity of the cerebellum in 15 migraineurs with aura (MWA), 23 migraineurs without aura (MWoA), and 20 healthy controls (HC). High-resolution T1 relaxation maps were acquired together with magnetization transfer images in order to probe microstructural and myelin integrity. Clusterwise analysis was performed on the T1 and magnetization transfer ratio (MTR) maps of the cerebellum of MWA, MWoA, and HC using an ANOVA and a non-parametric clusterwise permutation F test, with age and gender as covariates and correction for the familywise error rate. In addition, mean MTR and T1 in frontal regions known to be highly connected to the cerebellum were computed. Clusterwise comparison among groups showed a cluster of lower MTR in the right Crus I of MWoA patients vs. HC and MWA subjects (p = 0.04). Univariate and bivariate analyses of the T1 and MTR contrasts showed that MWoA patients had longer T1 and lower MTR in the right and left pars orbitalis compared to MWA (p < 0.01 and 0.05, respectively), but no differences were found with respect to HC. Lower MTR and longer T1 point to a loss of macromolecules and/or micro-edema in Crus I and the pars orbitalis in MWoA patients vs. HC and vs. MWA. The pathophysiological implications of these findings are discussed in light of recent literature.
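
As a simplified, univariate analogue of the clusterwise permutation F test mentioned above, the sketch below permutes group labels and recomputes a one-way F statistic on simulated regional MTR values for three groups; the group means, spreads and sizes are illustrative only.

```python
# Simplified, voxel-less sketch of a permutation F test comparing three groups
# (e.g. mean MTR in one region); data are simulated, not the study's images.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
groups = [rng.normal(0.52, 0.03, 15),   # MWA (hypothetical regional MTR)
          rng.normal(0.49, 0.03, 23),   # MWoA
          rng.normal(0.52, 0.03, 20)]   # HC

observed_f = stats.f_oneway(*groups).statistic
pooled = np.concatenate(groups)
sizes = [len(g) for g in groups]

n_perm, exceed = 5000, 0
for _ in range(n_perm):
    perm = rng.permutation(pooled)                       # shuffle group labels
    split = np.split(perm, np.cumsum(sizes)[:-1])
    if stats.f_oneway(*split).statistic >= observed_f:
        exceed += 1

print(f"observed F = {observed_f:.2f}, permutation p = {(exceed + 1) / (n_perm + 1):.4f}")
```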

Relevance: 80.00%

Abstract:

In occupational exposure assessment of airborne contaminants, exposure levels can be estimated through repeated measurements of the pollutant concentration in air, through expert judgment, or through exposure models that use information on the conditions of exposure as input. In this report, we propose an empirical hierarchical Bayesian model to unify these approaches. Prior to any measurement, the hygienist conducts an assessment to generate prior distributions of exposure determinants. Monte-Carlo samples from these distributions feed two level-2 models: a physical, two-compartment model and a non-parametric neural network model trained on existing exposure data. The outputs of these two models are weighted according to the expert's assessment of their relevance to yield predictive distributions of the long-term geometric mean and geometric standard deviation of the worker's exposure profile (level-1 model). Bayesian inferences are then drawn iteratively from subsequent measurements of worker exposure. Any traditional decision strategy based on a comparison with occupational exposure limits (e.g. mean exposure, exceedance strategies) can then be applied. Data on 82 workers exposed to 18 contaminants in 14 companies were used to validate the model with cross-validation techniques. A user-friendly program running the model is available upon request.

Relevance: 80.00%

Abstract:

Background: As part of the second-generation surveillance system for HIV/Aids in Switzerland, repeated cross-sectional surveys were conducted in 1993, 1994, 1996, 2000, 2006 and 2011 among attenders of all low-threshold facilities (LTFs) with needle exchange programmes and/or supervised drug consumption rooms for injection or inhalation. The number of syringes distributed to injectors has also been measured annually since 2000. Distribution in other settings, such as pharmacies, is also monitored nationally. Methods: Periodic surveys of LTFs have been conducted using an interviewer/self-administered questionnaire structured along four themes: socio-demographic characteristics, drug consumption, risk/preventive behaviour and health. Analysis is restricted to attenders who had injected drugs during their lifetime (IDUs). Pearson's chi-square test and trend analysis were conducted on annual aggregated data. Trend significance was assessed using Stata's non-parametric nptrend test. Results: Median age of IDUs increased from 26 years in 1993 to 40 in 2011; most are men (78%). The total yearly number of syringes distributed by LTFs has decreased by 44% in 10 years. Use of cocaine has increased (Table 1). Injection, regular use of heroin and borrowing of syringes/needles have decreased, while sharing of other material remains stable. There are fewer new injectors; more IDUs report substitution treatment. Most attenders had ever been tested for HIV (90% in 1993, 94% in 2011). Reported prevalence of HIV remained stable at around 10%; that of HCV decreased from 62% in 2000 to 42% in 2011. Conclusions: Overall, the findings indicate a decrease in injection as a means of drug consumption in this population. This interpretation is supported by data from other sources, such as a national decrease in distribution from other delivery points. Switzerland's behavioural surveillance system is sustainable and allows the HIV epidemic to be monitored in this hard-to-reach population, providing information for planning and evaluation.
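
A minimal sketch of a non-parametric trend check on aggregated survey waves; it uses Kendall's tau between survey year and a binary indicator as a stand-in for, not a reimplementation of, Stata's nptrend test, and all counts below are invented for illustration.

```python
# Minimal sketch of a non-parametric trend check across survey years, using
# Kendall's tau on expanded individual-level 0/1 observations; the sample
# sizes and counts are hypothetical, not the study's data.
import numpy as np
from scipy import stats

years = [1993, 1994, 1996, 2000, 2006, 2011]
n_respondents = [900, 850, 800, 750, 700, 650]           # hypothetical sample sizes
n_injected_last_month = [600, 560, 500, 420, 330, 250]   # hypothetical counts

# Expand aggregated counts back to individual 0/1 observations
year_obs, outcome_obs = [], []
for y, n, k in zip(years, n_respondents, n_injected_last_month):
    year_obs += [y] * n
    outcome_obs += [1] * k + [0] * (n - k)

tau, p = stats.kendalltau(year_obs, outcome_obs)
print(f"Kendall's tau = {tau:.3f}, p = {p:.2e}  (negative tau: declining trend)")
```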

Relevance: 80.00%

Abstract:

Introduction: The Fragile X-associated Tremor/Ataxia Syndrome (FXTAS) is a recently described, and under-diagnosed, late-onset (≈ 60 y) neurodegenerative disorder affecting male carriers of a premutation in the Fragile X Mental Retardation 1 (FMR1) gene. The premutation is a CGG (Cytosine-Guanine-Guanine) expansion (55 to 200 CGG repeats) in the proximal region of the FMR1 gene. Patients with FXTAS primarily present with cerebellar ataxia and intention tremor. Neuroradiological features of FXTAS include prominent white matter disease in the periventricular, subcortical and deep cerebellar white matter and in the middle cerebellar peduncles on T2-weighted or FLAIR MR imaging (Jacquemmont 2007, Loesch 2007, Brunberg 2002, Cohen 2006). We hypothesize that a significant white matter alteration is present in younger individuals many years prior to clinical symptoms and/or the presence of visible lesions on conventional MR sequences, and might be detectable by magnetization transfer (MT) imaging. Methods: Eleven asymptomatic premutation carriers (mean age = 55 years) and seven intra-familial controls participated in the study. A standardized neurological examination was performed on all participants, and a neuropsychological evaluation was carried out before MR scanning performed on a 3T Siemens Trio. The protocol included a sagittal T1-weighted 3D gradient-echo sequence (MPRAGE, 160 slices, 1 mm^3 isotropic voxels) and a gradient-echo MTI (FA 30, TE 15, matrix size 256*256, pixel size 1*1 mm, 36 slices (thickness 2 mm), MT pulse duration 7.68 ms, FA 500, frequency offset 1.5 kHz). MTI was performed by acquiring two sets of images consecutively, first with and then without the MT saturation pulse. MT images were coregistered to the T1 acquisition. The MTR for every intracranial voxel was calculated as MTR = (M0 - MS)/M0 * 100%, creating an MTR map for each subject. As a first analysis, the whole white matter (WM) was used to mask the MTR image in order to create a histogram of the MTR distribution in the whole tissue class over the two groups examined. Then, for each subject, we performed a segmentation and parcellation of the brain by means of the Freesurfer software, starting from the high-resolution T1-weighted anatomical acquisition. The cortical parcellation was used to assign a label to the underlying white matter by constructing a Voronoi diagram in the WM voxels of the MR volume based on the distance to the nearest cortical parcellation label. This procedure allowed us to subdivide the cerebral WM into 78 ROIs according to the cortical parcellation (see example in Fig 1). The cerebellum, by the same procedure, was subdivided into 5 ROIs (2 per hemisphere and one corresponding to the brainstem). For each subject, we calculated the mean value of MTR within each ROI and averaged over controls and patients. Significant differences between the two groups were tested using a two-sample t-test (p < 0.01). Results: The neurological examination showed that no patient yet met the clinical criteria of the Fragile X-associated Tremor/Ataxia Syndrome. Nonetheless, premutation carriers showed some subtle neurological signs of the disorder: a significant increase in tremor (CRST, t-test p = 0.007) and in ataxia (ICARS, p = 0.004) when compared to controls. The neuropsychological evaluation was normal in both groups.
To obtain a general characterization of myelination for each subject and each group, we first computed the distribution of MTR values across the total white matter volume and averaged it for each group. We tested the equality of the two distributions with the non-parametric Kolmogorov-Smirnov test and rejected the null hypothesis at p = 0.03 (fig. 2). As expected, when comparing the asymptomatic premutation carriers with control subjects, the peak value and peak position of the MTR values within the whole WM were decreased and the width of the distribution curve was increased (p < 0.01). These three changes point to an alteration of the global myelin status of the premutation carriers. Subsequently, to analyze the regional myelination and white matter integrity of the same group, we performed a ROI analysis of the MTR data. The ROI-based analysis showed a decrease of the mean MTR value in premutation carriers compared to controls in the bilateral orbito-frontal and inferior frontal WM, the entorhinal and cingulum regions, and the cerebellum (Fig 3). These differences were not detected with other conventional MR techniques. Conclusions: These preliminary data confirm that in premutation carriers there are indeed alterations in "normal appearing white matter" (NAWM), and that these alterations are visible with the MT technique. These results indicate that MT imaging may be a relevant approach to detect both global and local alterations within the NAWM of "asymptomatic" carriers of premutations in the Fragile X Mental Retardation 1 (FMR1) gene. The sensitivity of MT in the detection of these alterations might point towards a specific pathophysiological mechanism linked to an underlying myelin disorder. The ROI-based analyses show that the frontal, parahippocampal and cerebellar regions are already significantly affected before the onset of symptoms. A larger sample will allow us to determine the minimum CGG expansion and age associated with these subclinical white matter alterations.
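
A minimal sketch of the voxel-wise MTR computation given above (MTR = (M0 - MS)/M0 × 100%) followed by a two-sample Kolmogorov-Smirnov comparison of whole-WM MTR distributions; the arrays are simulated stand-ins for the actual MR volumes and masks.

```python
# Minimal sketch: compute an MTR map from two volumes and compare whole-WM
# MTR distributions of two groups with a Kolmogorov-Smirnov test; all image
# data below are simulated, not actual MR acquisitions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
shape = (64, 64, 36)
m0 = rng.normal(1000.0, 50.0, shape)          # image without MT saturation pulse
ms = m0 * rng.normal(0.55, 0.05, shape)       # image with MT saturation pulse

mtr_map = (m0 - ms) / m0 * 100.0              # percent MTR in every voxel
wm_mask = rng.random(shape) > 0.6             # stand-in for the white-matter mask

# Whole-WM MTR distributions for two (simulated) groups of subjects
carriers_wm_mtr = mtr_map[wm_mask] - rng.normal(1.0, 0.5)   # shifted to mimic carriers
controls_wm_mtr = mtr_map[wm_mask]

ks_stat, ks_p = stats.ks_2samp(carriers_wm_mtr, controls_wm_mtr)
print(f"KS statistic = {ks_stat:.3f}, p = {ks_p:.3g}")
```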

Relevance: 80.00%

Abstract:

INTRODUCTION AND AIMS: Inpatient satisfaction with the food served within a hospital care system remains one of the main targets of efforts to modernize food services. The impact of type of menu, food category, hospital centre and timetable on the meal wastage produced in different Spanish healthcare settings was evaluated. METHODS: Meal wastage was measured on a semiquantitative 5-point scale ("nothing on plate"; "¼ on plate"; "half on plate"; "¾ on plate" and "all on plate"). The study was carried out in two periods of three months each, in 2010 and 2011. A trained person measured plate waste for 726 servings belonging to 11 menus. In total, 31,392 plates were served to 7,868 inpatients. A Kruskal-Wallis non-parametric test (p < 0.05) was applied to evaluate significant differences among the variables studied. RESULTS: The menus were satisfactorily consumed, as more than 50% of the plates were classified as "nothing on plate". Regarding food categories, 26.78% of the plates corresponded to soups and purées, while pasta and rice, and prepared foods, were only distributed in 4-5% of the servings. Desserts were mostly consumed, while cooked vegetables were less accepted by the inpatients evaluated. Other factors, such as the hospital centre, influenced plate waste (p < 0.05), but the timetable did not (p > 0.05). CONCLUSION: Visual inspection of plate waste might be useful to optimize the type and quality of the menus served. The type of menu served and the food category could have a great influence on food acceptability by the inpatients studied.
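
A minimal sketch of a Kruskal-Wallis comparison of an ordinal plate-waste score across food categories, with invented score distributions (0 = "nothing on plate" … 4 = "all on plate") rather than the study's data.

```python
# Minimal sketch of a Kruskal-Wallis test on an ordinal plate-waste scale
# across three food categories; the probabilities and sample sizes are
# illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
desserts = rng.choice([0, 1, 2, 3, 4], size=200, p=[0.70, 0.15, 0.08, 0.05, 0.02])
soups = rng.choice([0, 1, 2, 3, 4], size=200, p=[0.55, 0.20, 0.12, 0.08, 0.05])
cooked_vegetables = rng.choice([0, 1, 2, 3, 4], size=200, p=[0.35, 0.20, 0.20, 0.15, 0.10])

h_stat, p_value = stats.kruskal(desserts, soups, cooked_vegetables)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.4f} (significant if p < 0.05)")
```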

Relevance: 80.00%

Abstract:

Our project aims to analyze the relevance of economic factors (mainly income and other socioeconomic characteristics of Spanish households, and market prices) to the prevalence of obesity in Spain, and to establish to what extent, and under what circumstances, market intervention prices are effective in reducing obesity and improving the quality of the diet. In relation to the existing literature worldwide, this project is the first attempt in Spain to obtain an overall picture of the effectiveness of public policies both on food consumption and the quality of the diet, on the one hand, and on the prevalence of obesity, on the other. The project consists of four main parts. The first part is a critical review of the literature on the economic approach to the problems of obesity prevalence, diet quality and public intervention policies. Although another important body of the obesity literature deals with physical exercise, we limit our attention to studies related to food consumption, in keeping with the scope of our study and because many published reviews already cover the literature on physical exercise and its effect on obesity prevalence. The second part consists of a parametric and non-parametric analysis of the role of economic factors in obesity prevalence in Spain. The third part seeks to overcome the shortcomings of the many diet quality indices developed in recent decades, such as the Healthy Eating Index, the Diet Quality Index, the Healthy Diet Indicator and the Mediterranean Diet Score, through the development of a new obesity-specific diet quality index. The last part of the project assesses the effectiveness of market intervention policies in improving the healthiness of the Spanish diet, using the new Exact Affine Stone Index (EASI) demand system.

Relevance: 80.00%

Abstract:

As stated in Aitchison (1986), a proper study of relative variation in a compositional data set should be based on logratios, and dealing with logratios excludes dealing with zeros. Nevertheless, it is clear that zero observations might be present in real data sets, either because the corresponding part is completely absent – essential zeros – or because it is below the detection limit – rounded zeros. Because the second kind of zeros is usually understood as "a trace too small to measure", it seems reasonable to replace them by a suitable small value, and this has been the traditional approach. As stated, e.g., by Tauber (1999) and by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000), the principal problem in compositional data analysis is related to rounded zeros. One should be careful to use a replacement strategy that does not seriously distort the general structure of the data. In particular, the covariance structure of the involved parts – and thus the metric properties – should be preserved, as otherwise further analysis on subpopulations could be misleading. Following this point of view, a non-parametric imputation method is introduced in Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000). This method is analyzed in depth by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2003), where it is shown that the theoretical drawbacks of the additive zero replacement method proposed in Aitchison (1986) can be overcome using a new multiplicative approach on the non-zero parts of a composition. The new approach has reasonable properties from a compositional point of view. In particular, it is "natural" in the sense that it recovers the "true" composition if replacement values are identical to the missing values, and it is coherent with the basic operations on the simplex. This coherence implies that the covariance structure of subcompositions with no zeros is preserved. As a generalization of the multiplicative replacement, a substitution method for missing values in compositional data sets is introduced in the same paper.
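
A minimal sketch of the multiplicative replacement strategy described above: rounded zeros are set to a small value delta and the non-zero parts are shrunk by a common factor so the composition keeps its original total; the delta value and the example composition are arbitrary here.

```python
# Minimal sketch of multiplicative replacement of rounded zeros in a
# composition summing to `total`; delta and the example values are arbitrary.
import numpy as np

def multiplicative_replacement(x, delta=0.005, total=1.0):
    """Replace zeros in a composition x (summing to `total`) by `delta`,
    shrinking the non-zero parts by a common multiplicative factor."""
    x = np.asarray(x, dtype=float)
    zeros = x == 0
    return np.where(zeros, delta, x * (1.0 - delta * zeros.sum() / total))

composition = np.array([0.60, 0.25, 0.15, 0.0, 0.0])   # two rounded zeros
print(multiplicative_replacement(composition))          # sums to 1 again
```

Because the same factor rescales every non-zero part, ratios among the non-zero parts (and hence the covariance structure of zero-free subcompositions) are preserved, which is the property emphasized above.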