982 results for causality testing in VaRs with bootstrapping
Abstract:
Perceptual grouping by luminance similarity and by proximity was investigated in infants with Williams syndrome (WS) aged between 6 and 36 months (visit 1, N=29). Eight months later, WS infants who were still under 36 months old repeated the testing procedure (visit 2, N=15). Performance was compared to typically developing (TD) infants aged from 2 to 20 months (N=63). Consistent with the literature, TD participants showed grouping by luminance at the youngest testing age, 2 months. Grouping by proximity had not previously been charted in typical development: this study showed grouping by proximity at 8 months. Infants with WS could group by luminance. Developmental progression of the WS group showed some similarities to typical development, although further investigation is required to address this in more depth. In contrast, infants with WS were not able to group by proximity. This pattern of emergence and development of grouping abilities is considered in relation to the pattern of grouping abilities observed in adults with WS.
Abstract:
Molecular and behavioural evidence points to an association between sex-steroid hormones and autism spectrum conditions (ASC) and/or autistic traits. Prenatal androgen levels are associated with autistic traits, and several genes involved in steroidogenesis are associated with autism, Asperger Syndrome and/or autistic traits. Furthermore, higher rates of androgen-related conditions (such as Polycystic Ovary Syndrome, hirsutism, acne and hormone-related cancers) are reported in women with autism spectrum conditions. A key question therefore is whether serum levels of gonadal and adrenal sex-steroids (particularly testosterone, estradiol, dehydroepiandrosterone sulfate and androstenedione) are elevated in individuals with ASC. This was tested in a total sample of n=166 participants. The final eligible sample for hormone analysis comprised n=128 participants, n=58 of whom had a diagnosis of Asperger Syndrome or high functioning autism (33 males and 25 females) and n=70 of whom were age- and IQ-matched typical controls (39 males and 31 females). ASC diagnosis (without any interaction with sex) strongly predicted androstenedione levels (p<0.01), and serum androstenedione levels were significantly elevated in the ASC group (Mann-Whitney W=2677, p=0.002), a result confirmed by permutation testing in females (permutation-corrected p=0.02). This result is discussed in terms of androstenedione being the immediate precursor of, and being converted into, testosterone, dihydrotestosterone, or estrogens in hormone-sensitive tissues and organs.
Abstract:
Developments in high-throughput genotyping provide an opportunity to explore the application of marker technology in distinctness, uniformity and stability (DUS) testing of new varieties. We have used a large set of molecular markers to assess the feasibility of a UPOV Model 2 approach: “Calibration of threshold levels for molecular characteristics against the minimum distance in traditional characteristics”. We have examined 431 winter and spring barley varieties, with data from UK DUS trials comprising 28 characteristics, together with genotype data from 3072 SNP markers. Intervarietal distances were calculated, and we found higher correlations between molecular and morphological distances than have previously been reported. When varieties were grouped by kinship, phenotypic and genotypic distances of these groups correlated well. We estimated the minimum number of markers required and showed there was a ceiling beyond which the correlations do not improve. To investigate the possibility of breaking through this ceiling, we attempted genomic prediction of phenotypes from genotypes, and higher correlations were achieved. We tested distinctness decisions made using either morphological or genotypic distances and found poor correspondence between the two methods.
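The core comparison in the study above can be illustrated with a toy sketch (hypothetical data, not the study's 3072-SNP panel or its distance measures): pairwise genotypic distances are correlated against pairwise morphological distances across varieties.

```python
# Illustrative sketch with toy data (assumed distance measures: Hamming for
# SNP genotypes, Euclidean for morphological traits; the paper's own metrics
# are not specified here).
import random
random.seed(1)

n_var, n_snp, n_traits = 6, 50, 5
geno = [[random.randint(0, 1) for _ in range(n_snp)] for _ in range(n_var)]
# Make morphology loosely reflect genotype so the two distances correlate.
morph = [[sum(g[:10]) + random.gauss(0, 1) for _ in range(n_traits)]
         for g in geno]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def euclid(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

pairs = [(i, j) for i in range(n_var) for j in range(i + 1, n_var)]
gd = [hamming(geno[i], geno[j]) for i, j in pairs]   # genotypic distances
md = [euclid(morph[i], morph[j]) for i, j in pairs]  # morphological distances

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

print(round(pearson(gd, md), 2))
```

In the actual study, significance of such a matrix correlation would normally be assessed with a permutation (Mantel-type) test rather than read off directly, since the pairwise distances are not independent.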
Abstract:
Differences-in-Differences (DID) is one of the most widely used identification strategies in applied economics. However, how to draw inferences in DID models when there are few treated groups remains an open question. We show that the usual inference methods used in DID models might not perform well when there are few treated groups and errors are heteroskedastic. In particular, we show that when there is variation in the number of observations per group, inference methods designed to work when there are few treated groups tend to (under-) over-reject the null hypothesis when the treated groups are (large) small relative to the control groups. This happens because larger groups tend to have lower variance, generating heteroskedasticity in the group x time aggregate DID model. We provide evidence from Monte Carlo simulations and from placebo DID regressions with the American Community Survey (ACS) and the Current Population Survey (CPS) datasets to show that this problem is relevant even in datasets with large numbers of observations per group. We then derive an alternative inference method that provides accurate hypothesis testing in situations where there are few treated groups (or even just one) and many control groups in the presence of heteroskedasticity. Our method assumes that we can model the heteroskedasticity of a linear combination of the errors. We show that this assumption can be satisfied without imposing strong assumptions on the errors in common DID applications. With many pre-treatment periods, we show that this assumption can be relaxed. Instead, we provide an alternative inference method that relies on strict stationarity and ergodicity of the time series. Finally, we consider two recent alternatives to DID when there are many pre-treatment periods. We extend our inference methods to linear factor models when there are few treated groups. 
We also derive conditions under which a permutation test for the synthetic control estimator proposed by Abadie et al. (2010) is robust to heteroskedasticity, and propose a modification of the test statistic that provided a better heteroskedasticity correction in our simulations.
Abstract:
Differences-in-Differences (DID) is one of the most widely used identification strategies in applied economics. However, how to draw inferences in DID models when there are few treated groups remains an open question. We show that the usual inference methods used in DID models might not perform well when there are few treated groups and errors are heteroskedastic. In particular, we show that when there is variation in the number of observations per group, inference methods designed to work when there are few treated groups tend to (under-) over-reject the null hypothesis when the treated groups are (large) small relative to the control groups. This happens because larger groups tend to have lower variance, generating heteroskedasticity in the group x time aggregate DID model. We provide evidence from Monte Carlo simulations and from placebo DID regressions with the American Community Survey (ACS) and the Current Population Survey (CPS) datasets to show that this problem is relevant even in datasets with large numbers of observations per group. We then derive an alternative inference method that provides accurate hypothesis testing in situations where there are few treated groups (or even just one) and many control groups in the presence of heteroskedasticity. Our method assumes that we know how the heteroskedasticity is generated, which is the case when it is generated by variation in the number of observations per group. With many pre-treatment periods, we show that this assumption can be relaxed. Instead, we provide an alternative application of our method that relies on assumptions about stationarity and convergence of the moments of the time series. Finally, we consider two recent alternatives to DID when there are many pre-treatment periods. We extend our inference method to linear factor models when there are few treated groups.
We also propose a permutation test for the synthetic control estimator that provided a better heteroskedasticity correction in our simulations than the test suggested by Abadie et al. (2010).
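The placebo/permutation logic that both versions of this abstract build on can be illustrated with a toy simulation (hypothetical code, not the authors' implementation): with one treated group and many controls, placebo DID estimates computed on the control groups form a reference distribution against which the observed estimate is judged.

```python
# Toy placebo-style permutation test for a DID estimate with a single treated
# group (illustrative only; group sizes and error structure are assumed).
import random
import statistics

random.seed(0)

def did_estimate(pre, post, treated_idx):
    """Group x time aggregate DID: change in the treated group minus the
    average change in the control groups."""
    changes = [post[g] - pre[g] for g in range(len(pre))]
    controls = [c for g, c in enumerate(changes) if g != treated_idx]
    return changes[treated_idx] - statistics.mean(controls)

# Simulated group-level means: 1 treated group (index 0), 19 controls.
n_groups = 20
effect = 2.0  # true treatment effect on group 0
pre = [random.gauss(0, 1) for _ in range(n_groups)]
post = [pre[g] + random.gauss(0, 1) + (effect if g == 0 else 0.0)
        for g in range(n_groups)]

observed = did_estimate(pre, post, treated_idx=0)

# Placebo distribution: pretend each control group was the treated one.
placebos = [did_estimate(pre, post, g) for g in range(1, n_groups)]
p_value = (1 + sum(abs(b) >= abs(observed) for b in placebos)) / (1 + len(placebos))
print(round(observed, 2), round(p_value, 3))
```

The paper's point is that this simple test is only valid under homoskedasticity across groups; when group sizes (and hence variances) differ, the placebo distribution is no longer exchangeable with the treated group's estimate, which motivates their corrected test statistic.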
Abstract:
Tests on printed circuit boards and integrated circuits are widely used in industry, resulting in reduced design time and cost of a project. The functional and connectivity tests in this type of circuit soon became a concern for manufacturers, leading to research for a reliable, quick, cheap and universal solution. Initially, test schemes were based on a set of needles connected to the inputs and outputs of the integrated circuit board (bed-of-nails), to which signals were applied in order to verify whether the circuit was according to the specifications and could be assembled on the production line. With the development of projects, circuit miniaturization, improvement of the production processes and of the materials used, as well as the increase in the number of circuits, it became necessary to search for another solution. Thus Boundary-Scan Testing was developed, which operates on the border of integrated circuits and allows testing the connectivity of the input and output ports of a circuit. The Boundary-Scan Testing method was standardized in 1990 by the IEEE, becoming known as the IEEE 1149.1 Standard. Since then a large number of manufacturers have adopted this standard in their products. The main objective of this master's thesis is the design of Boundary-Scan Testing in an image sensor in CMOS technology: analyzing the standard requirements and the process used in the prototype production, developing the design and layout of the Boundary-Scan, and analyzing the results obtained after production. Chapter 1 briefly presents the evolution of testing procedures used in industry, developments and applications of image sensors, and the motivation for the use of the Boundary-Scan Testing architecture. Chapter 2 explores the fundamentals of Boundary-Scan Testing and image sensors, starting with the Boundary-Scan architecture defined in the Standard, where the functional blocks are analyzed.
This understanding is necessary to implement the design on an image sensor. It also explains the architecture of image sensors currently in use, focusing on sensors with a large number of inputs and outputs. Chapter 3 describes the design of the implemented Boundary-Scan, analysing the design and functions of the prototype, the software used, and the designs and simulations of the functional blocks of the implemented Boundary-Scan. Chapter 4 presents the layout process based on the design developed in Chapter 3, describing the software used for this purpose, the planning of the layout location (floorplan) and its dimensions, the layout of individual blocks, the layout rule checks, the comparison with the final design and, finally, the simulation. Chapter 5 describes how the functional tests were performed to verify the design's compliance with the specifications of Standard IEEE 1149.1. These tests focused on the application of signals to the input and output ports of the produced prototype. Chapter 6 presents the conclusions drawn throughout the execution of the work.
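At the heart of the IEEE 1149.1 architecture described above is the TAP controller, a 16-state machine driven by the TMS signal on each TCK edge. The sketch below (a simplified behavioural model for illustration, not the thesis' silicon implementation) encodes the standard's state transitions and checks the well-known property that five TCK cycles with TMS held high return the controller to Test-Logic-Reset from any state.

```python
# Behavioural model of the IEEE 1149.1 TAP controller state machine
# (illustrative only). Each state maps (TMS=0, TMS=1) to the next state.
TAP = {
    "Test-Logic-Reset": ("Run-Test/Idle", "Test-Logic-Reset"),
    "Run-Test/Idle":    ("Run-Test/Idle", "Select-DR-Scan"),
    "Select-DR-Scan":   ("Capture-DR", "Select-IR-Scan"),
    "Capture-DR":       ("Shift-DR", "Exit1-DR"),
    "Shift-DR":         ("Shift-DR", "Exit1-DR"),
    "Exit1-DR":         ("Pause-DR", "Update-DR"),
    "Pause-DR":         ("Pause-DR", "Exit2-DR"),
    "Exit2-DR":         ("Shift-DR", "Update-DR"),
    "Update-DR":        ("Run-Test/Idle", "Select-DR-Scan"),
    "Select-IR-Scan":   ("Capture-IR", "Test-Logic-Reset"),
    "Capture-IR":       ("Shift-IR", "Exit1-IR"),
    "Shift-IR":         ("Shift-IR", "Exit1-IR"),
    "Exit1-IR":         ("Pause-IR", "Update-IR"),
    "Pause-IR":         ("Pause-IR", "Exit2-IR"),
    "Exit2-IR":         ("Shift-IR", "Update-IR"),
    "Update-IR":        ("Run-Test/Idle", "Select-DR-Scan"),
}

def step(state, tms_bits):
    """Advance the TAP controller through a sequence of TMS values."""
    for tms in tms_bits:
        state = TAP[state][tms]
    return state

# Five TCK cycles with TMS=1 reset the controller from any state:
assert all(step(s, [1, 1, 1, 1, 1]) == "Test-Logic-Reset" for s in TAP)
```

A functional test bench like the one in Chapter 5 exercises exactly these transitions from the TDI/TDO/TMS/TCK pins to verify that the fabricated TAP conforms to the standard.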
Abstract:
Objective: To characterize articular and systemic inflammatory activity in juvenile idiopathic arthritis (JIA), identifying remission status with and without medication. Methods: A total of 165 JIA cases, followed for a mean period of 3.6 years, were reviewed in order to characterize episodes of inactivity and clinical remission on and off medication. The resulting data were analyzed by means of descriptive statistics, survival analysis by comparison of Kaplan-Meier curves, log-rank testing and binary logistic regression analysis, in order to identify predictive factors for remission or persistent activity. Results: One hundred and eight of the cases reviewed fulfilled the inclusion criteria: 57 patients (52.7%) exhibited a total of 71 episodes of inactivity, with a mean of 2.9 years per episode; 36 inactivity episodes (50.7%) resulted in clinical remission off medication, 35% of which were of the persistent oligoarticular subtype. The probability of clinical remission on medication over 2 years was 81, 82, 97 and 83% for cases of persistent oligoarticular, extended oligoarticular, polyarticular and systemic JIA, respectively. The probability of clinical remission off medication 5 years after onset of remission was 40 and 67% for patients with persistent oligoarticular and systemic JIA, respectively. Persistent disease activity was significantly associated with the use of an anti-rheumatic drug combination. Age at JIA onset was the only factor that predicted clinical remission (p = 0.002). Conclusions: In this cohort, the probability of JIA progressing to clinical remission was greater for the persistent oligoarticular and systemic subtypes, when compared with polyarticular cases.
Abstract:
There is a well-developed framework, the Black-Scholes theory, for the pricing of contracts based on the future prices of certain assets, called options. This theory assumes that the probability distribution of the returns of the underlying asset is a Gaussian distribution. However, it is observed in the market that this hypothesis is flawed, leading to the introduction of a fudge factor, the so-called volatility smile. Therefore, it would be interesting to explore extensions of the Black-Scholes theory to non-Gaussian distributions. In this paper, we provide an explicit formula for the price of an option when the distribution of the returns of the underlying asset is parametrized by an Edgeworth expansion, which allows for the introduction of higher independent moments of the probability distribution, namely skewness and kurtosis. We test our formula with options in the Brazilian and American markets, showing that the volatility smile can be reduced. We also check whether our approach leads to more efficient hedging strategies of these instruments. (C) 2004 Elsevier B.V. All rights reserved.
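A well-known closed form in this family is the Corrado-Su adjustment, where skewness and excess-kurtosis terms are added to the Black-Scholes call price. The sketch below implements that adjustment as an illustration of the idea; it is not necessarily the exact Edgeworth formula derived in the paper, and the correction terms vanish when the return distribution is Gaussian (g1 = g2 = 0).

```python
# Black-Scholes European call plus Corrado-Su-style skewness (g1) and
# excess-kurtosis (g2) corrections (illustrative; not the paper's formula).
from math import log, sqrt, exp, pi, erf

def n_pdf(x):
    return exp(-x * x / 2) / sqrt(2 * pi)

def n_cdf(x):
    return 0.5 * (1 + erf(x / sqrt(2)))

def call_price(S, K, r, sigma, T, g1=0.0, g2=0.0):
    d1 = (log(S / K) + (r + sigma**2 / 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    bs = S * n_cdf(d1) - K * exp(-r * T) * n_cdf(d2)
    # Correction terms; both vanish for Gaussian returns (g1 = g2 = 0).
    q3 = S * sigma * sqrt(T) / 6 * (
        (2 * sigma * sqrt(T) - d1) * n_pdf(d1) + sigma**2 * T * n_cdf(d1))
    q4 = S * sigma * sqrt(T) / 24 * (
        (d1**2 - 1 - 3 * sigma * sqrt(T) * (d1 - sigma * sqrt(T))) * n_pdf(d1)
        + sigma**3 * T**1.5 * n_cdf(d1))
    return bs + g1 * q3 + g2 * q4
```

Calibrating g1 and g2 to market prices is what lets a single volatility parameter fit options across strikes, flattening the smile that a pure Gaussian model produces.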
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
OBJECTIVES To investigate the feasibility and validity of sampling blood from the carpal pad in hospitalised healthy and diabetic dogs. METHODS The carpal pad was compared to the ear as a sampling site in 60 dogs (30 healthy and 30 diabetic dogs). RESULTS Lancing the pads was very well tolerated. The average glucose concentrations in blood samples obtained from the ears and carpal pads exhibited a strong positive correlation (r = 0.938) and there were no significant differences between them (P = 0.914). In addition, 98.3% of the values obtained were clinically acceptable when assessed by the error grid analysis. CLINICAL SIGNIFICANCE The carpal pad is a good alternative sampling site for home monitoring, especially in animals with a soft and/or light-coloured pad.
Abstract:
Background: a program for phonological remediation in developmental dyslexia. Aim: to verify the efficacy of a program for phonological remediation in students with developmental dyslexia. Specific goals of this study involved the comparison of the linguistic-cognitive performance of students with developmental dyslexia with that of students considered good readers; to compare the results obtained in pre- and post-testing situations of students with dyslexia who were and were not submitted to the program; and to compare the results obtained with the phonological remediation program in students with developmental dyslexia to those obtained in good readers. Method: participants of this study were 24 students, divided as follows: Group I (GI) was divided into two subgroups - GIe, with 6 students with developmental dyslexia who were submitted to the program, and GIc, with 6 students with developmental dyslexia who were not submitted to the program; Group II (GII) was also divided into two subgroups - GIIe, with 6 good readers who were submitted to the program, and GIIc, with 6 good readers who were not submitted to the program. The phonological remediation program (Gonzalez & Rosquete, 2002) was developed in three stages: pre-testing, training and post-testing. Results: results indicate that GI presented a lower performance in phonological skills, reading and writing when compared to GII in the pre-testing situation. However, GIe presented a performance similar to that of GII in the post-testing situation, indicating the effectiveness of the phonological remediation program in students with developmental dyslexia. Conclusion: this study made evident the effectiveness of the phonological remediation program in students with developmental dyslexia.
Abstract:
This paper discusses the investigation of an abrasive process for finishing flat workpieces, based on the combination of important grinding and lapping characteristics. Instead of loose abrasive grains between the workpiece and the lapping plate, a resinoid grinding wheel of hot-pressed silicon carbide is placed on the plate of a device resembling a lapping machine. The resin bond grinding wheel is dressed with a single-point diamond. In addition to keeping the plate flat, dressing also plays the role of interfering in the behavior of the process by varying the overlap factor (Ud). It was found that the studied process simplifies the set-up and can be controlled more easily than lapping, which is a painstaking process. The surface roughness and flatness deviation proved comparable to those of lapping, or even finer, with the additional advantage of a less contaminated workpiece surface with a shiny appearance. The process was also monitored by acoustic emission (AE), which proved to be a promising and suitable technique for use in this process. Copyright © 2008 by ASME.
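For reference, the overlap factor Ud mentioned above is conventionally defined in the dressing literature as the ratio of the dresser's active contact width to the dressing lead (axial feed per wheel revolution); a minimal sketch, with symbols assumed from that convention rather than taken from the paper:

```python
# Overlap factor in single-point dressing (conventional definition, assumed):
#   Ud = b_d / s_d
# where b_d is the dresser's active contact width and s_d is the dressing
# lead (axial feed per wheel revolution). Higher Ud gives a finer wheel
# topography and hence lower workpiece roughness.
def overlap_factor(b_d_mm, s_d_mm_per_rev):
    return b_d_mm / s_d_mm_per_rev

# e.g. a 0.8 mm contact width at a 0.2 mm/rev dressing lead:
print(overlap_factor(0.8, 0.2))  # -> 4.0
```

Varying Ud between dressing passes is how the authors tune the wheel's cutting behavior between coarse stock removal and fine finishing.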
Abstract:
A search is presented for physics beyond the standard model (BSM) in final states with a pair of opposite-sign isolated leptons accompanied by jets and missing transverse energy. The search uses LHC data recorded at a center-of-mass energy √s = 7 TeV with the CMS detector, corresponding to an integrated luminosity of approximately 5 fb⁻¹. Two complementary search strategies are employed. The first probes models with a specific dilepton production mechanism that leads to a characteristic kinematic edge in the dilepton mass distribution. The second strategy probes models of dilepton production with heavy, colored objects that decay to final states including invisible particles, leading to very large hadronic activity and missing transverse energy. No evidence for an event yield in excess of the standard model expectations is found. Upper limits on the BSM contributions to the signal regions are deduced from the results, which are used to exclude a region of the parameter space of the constrained minimal supersymmetric extension of the standard model. Additional information related to detector efficiencies and response is provided to allow testing specific models of BSM physics not considered in this Letter. © 2012 CERN.
Abstract:
Objectives: Primary failure of tooth eruption (PFE) is a rare autosomal-dominant disease characterized by severe lateral open bite as a consequence of incomplete eruption of posterior teeth. Heterozygous mutations in the parathyroid hormone 1 receptor (PTH1R) gene have been shown to cause PFE likely due to protein haploinsufficiency. To further expand on the mutational spectrum of PFE-associated mutations, we report here on the sequencing results of the PTH1R gene in 70 index PFE cases. Materials and methods: Sanger sequencing of the PTH1R coding exons and their immediate flanking intronic sequences was performed with DNA samples from 70 index PFE cases. Results: We identified a total of 30 unique variants, of which 12 were classified as pathogenic based on their deleterious consequences on PTH1R protein while 16 changes were characterized as unclassified variants with as yet unknown effects on disease pathology. The remaining two variants represent common polymorphisms. Conclusions: Our data significantly increase the number of presently known unique PFE-causing PTH1R mutations and provide a series of variants with unclear pathogenicity which will require further in vitro assaying to determine their effects on protein structure and function. Clinical relevance: Management of PTH1R-associated PFE is problematic, in particular when teeth are exposed to orthodontic force. Therefore, upon clinical suspicion of PFE, molecular DNA testing is indicated to support decision making for further treatment options. © 2013 Springer-Verlag Berlin Heidelberg.
Abstract:
Objectives: The combination of sodium hypochlorite (NaOCl) and chlorhexidine (CHX) yields a potentially toxic precipitate (PPT). The aim of this study was to evaluate the tissue response to implanted polyethylene tubes filled with PPT-soaked fibrin sponge. Methods: Forty rats received four polyethylene tubes each; each tube was filled with fibrin sponge soaked with 2.5 % NaOCl, 2.0 % CHX, or PPT (2.5 % NaOCl plus 2.0 % CHX), or not soaked (control). The observation time points were 7, 15, 30, 60, and 90 days. At each time point, eight animals were killed, and the tubes and surrounding tissues were removed, fixed, and prepared for light microscopic analysis by performing glycol methacrylate embedding, serial cutting into 3-μm sections, and hematoxylin-eosin staining. Qualitative and quantitative evaluations of the reactions were performed. Results were statistically analyzed by the Kruskal-Wallis test (p < 0.05). Results: All chemical solutions caused moderate reactions at 7 days. On day 30, the PPT group was more cytotoxic than the control group and the CHX group (p < 0.05). On days 15 and 60, the PPT group was more cytotoxic than the control group (p < 0.05). On day 90, there was no statistically significant difference between the groups. Conclusion: PPT is more cytotoxic than NaOCl and CHX alone, particularly in the short term. Clinical significance: Protocols which suggest the combined use of CHX and NaOCl must be revised because this mixture produces a cytotoxic product. © 2013 Springer-Verlag Berlin Heidelberg.