935 results for test data generation
Abstract:
We consider forecasting with a combination of models when no single model coincides with a non-constant data generation process (DGP). Practical experience suggests that combining forecasts adds value, and can even dominate the best individual device. We show why this can occur when forecasting models are differentially mis-specified, and why it is likely to occur when the DGP is subject to location shifts. Moreover, averaging may then dominate over estimated weights in the combination. Finally, it cannot be proved that only non-encompassed devices should be retained in the combination. Empirical and Monte Carlo illustrations confirm the analysis.
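A minimal sketch of why equal-weight averaging can dominate both constituents when the devices are differentially mis-specified. This is not the paper's setup: the DGP, the ±2 biases, and the rolling-window length are all invented for illustration.

```python
import random

random.seed(42)

# Invented DGP subject to a location shift at t = 50 (illustrative only).
y = [(5.0 if t >= 50 else 0.0) + random.gauss(0, 1) for t in range(200)]

def rolling_mean(t, k=10):
    window = y[max(0, t - k):t]
    return sum(window) / len(window)

# Two differentially mis-specified devices: one biased up, one biased down.
def device_a(t):
    return rolling_mean(t) + 2.0

def device_b(t):
    return rolling_mean(t) - 2.0

def mse(forecast):
    errs = [(y[t] - forecast(t)) ** 2 for t in range(10, 200)]
    return sum(errs) / len(errs)

mse_a, mse_b = mse(device_a), mse(device_b)
mse_avg = mse(lambda t: 0.5 * (device_a(t) + device_b(t)))
print(mse_a, mse_b, mse_avg)  # the equal-weight average dominates both devices
```

Because the devices' biases have opposite signs, the simple average cancels them, so the combination beats either device on MSE even though neither model matches the shifting DGP.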
Abstract:
We present five new cloud detection algorithms over land based on dynamic threshold or Bayesian techniques, applicable to the Advanced Along Track Scanning Radiometer (AATSR) instrument, and compare these with the standard threshold-based SADIST cloud detection scheme. We use a manually classified dataset as a reference to assess algorithm performance and quantify the impact of each cloud detection scheme on land surface temperature (LST) retrieval. The use of probabilistic Bayesian cloud detection methods improves algorithm true skill scores by 8-9 % over SADIST (maximum score of 77.93 % compared to 69.27 %). We present an assessment of the impact of imperfect cloud masking, in relation to the reference cloud mask, on the retrieved AATSR LST, imposing a 2 K tolerance over a 3x3 pixel domain. We find an increase of 5-7 % in the observations falling within this tolerance when using Bayesian methods (maximum of 92.02 % compared to 85.69 %). We also demonstrate that the use of dynamic thresholds in the tests employed by SADIST can significantly improve performance, applicable to cloud-test data to be provided by the Sea and Land Surface Temperature Radiometer (SLSTR) due to be launched on the Sentinel 3 mission (estimated 2014).
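The true skill score reported above is conventionally the Hanssen-Kuipers statistic: hit rate minus false-alarm rate over a 2x2 contingency table against the reference mask. A minimal sketch with invented counts (not the AATSR results):

```python
def true_skill_score(tp, fn, fp, tn):
    """Hanssen-Kuipers true skill score: hit rate minus false-alarm rate."""
    hit_rate = tp / (tp + fn)          # cloudy pixels correctly flagged
    false_alarm_rate = fp / (fp + tn)  # clear pixels wrongly flagged
    return hit_rate - false_alarm_rate

# Toy contingency table of an automatic cloud mask vs. a manually
# classified reference (counts are invented, not the AATSR results).
score = true_skill_score(tp=850, fn=150, fp=80, tn=920)
print(f"TSS = {score:.2%}")  # -> TSS = 77.00%
```

A score of 1 means perfect discrimination and 0 means no skill, which is why it is a stricter measure than plain accuracy when clear pixels dominate.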
Abstract:
This paper deals with the complex issue of reversing long-term fertility improvements in soils derived from heathlands and acidic grasslands using sulfur-based amendments. The experiment was conducted on a former heathland and acid grassland in the U.K. that had been heavily fertilized and limed with rock phosphate, chalk, and marl. The experimental work had three aims. First, to determine whether sulfurous soil amendments are able to lower pH to a level suitable for heathland and acidic grassland re-creation (approximately 3 pH units). Second, to determine what effect the soil amendments have on the available pool of some basic cations and some potentially toxic acidic cations that may affect the plant community. Third, to determine whether the addition of Fe to the soil system would sequester phosphate (PO4(3-)) ions that might be liberated from rock phosphate by the experimental treatments. The application of S0 and Fe(II)SO4 to the soil was able to reduce pH; however, only the highest S0 treatment (2,000 kg/ha S) lowered pH sufficiently for heathland restoration purposes, but it did so effectively. Where pH was lowered, basic cations were lost from the exchangeable pool and replaced by acidic cations. Where Fe was added to the soil, there was no evidence of phosphate sequestration in soil test data (Olsen P), but sequestration was apparent from the lower foliar P in the grass sward. The ability of the forb Rumex acetosella to apparently detoxify Al3+, prevalent in acidified soils, appeared to give it a competitive advantage over other, less tolerant species. We would anticipate further changes in plant community structure through time, driven by Al3+ toxicity, leading to the competitive exclusion of less tolerant species. This, we suggest, is a key abiotic driver in the restoration of biotic (acidic plant) communities.
Abstract:
This work investigates the problem of feature selection among neuroimaging features from structural MRI brain images for classifying subjects as healthy controls, suffering from Mild Cognitive Impairment, or suffering from Alzheimer's Disease. A Genetic Algorithm wrapper method for feature selection is adopted in conjunction with a Support Vector Machine classifier. On very large feature sets, feature selection is found to be redundant, as accuracy is often worsened compared to a Support Vector Machine with no feature selection. However, when just the hippocampal subfields are used, feature selection yields a significant improvement in classification accuracy. Three-class Support Vector Machines and two-class Support Vector Machines combined with weighted voting are also compared, with the former found the more useful. The highest accuracy achieved in classifying the test data was 65.5%, using a genetic algorithm for feature selection with a three-class Support Vector Machine classifier.
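A genetic-algorithm wrapper evolves feature-subset bitmasks by selection, crossover and mutation, scoring each subset with the classifier's accuracy. A self-contained sketch of that loop, with the SVM cross-validation accuracy replaced by a stand-in fitness function; the feature count, the "informative" set, and all GA sizes are invented:

```python
import random

random.seed(0)

N_FEATURES = 20
INFORMATIVE = {0, 3, 7}  # hypothetical informative features (illustrative)

def fitness(mask):
    # Stand-in for the SVM cross-validation accuracy used by the wrapper:
    # reward selected informative features, penalise selected noise features.
    hits = sum(1 for i in INFORMATIVE if mask[i])
    noise = sum(mask) - hits
    return hits - 0.1 * noise

def ga_select(pop_size=30, generations=40, p_mut=0.05):
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]       # elitist truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N_FEATURES)
            child = a[:cut] + b[cut:]         # one-point crossover
            child = [bit ^ (random.random() < p_mut) for bit in child]
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = ga_select()
print([i for i, bit in enumerate(best) if bit])  # selected feature indices
```

In the real wrapper the fitness call is expensive (a full SVM train/validate cycle per mask), which is why wrapper methods pay off mainly on smaller feature sets such as the hippocampal subfields.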
Abstract:
The multivariate skew-t distribution (J Multivar Anal 79:93-113, 2001; J R Stat Soc, Ser B 65:367-389, 2003; Statistics 37:359-363, 2003) includes the Student t, skew-Cauchy and Cauchy distributions as special cases, and the normal and skew-normal distributions as limiting cases. In this paper, we explore the use of Markov Chain Monte Carlo (MCMC) methods to develop a Bayesian analysis of repeated-measures, pretest/post-test data under a multivariate null-intercept measurement error model (J Biopharm Stat 13(4):763-771, 2003) in which the random errors and the unobserved values of the covariate (latent variable) follow Student t and skew-t distributions, respectively. The results and methods are numerically illustrated with an example from the field of dentistry.
Abstract:
Photovoltaic processing is one of the significant steps in a semiconductor process line. It is complicated owing to the number of factors that directly or indirectly affect the processing and the final yield, so nothing can be stated assertively, mathematically or empirically, about the results, especially those relating to diffusion, antireflective coating and impurity poisoning. Here I have experimented on and collected data from mono-crystalline silicon wafers with varying properties and outputs. Using a neural network trained on the available experimental data, the required output can be estimated and then checked against test data for authenticity. One can regard this as a kind of process simulation in which the raw-wafer inputs are varied to obtain the desired yield of mono-crystalline photovoltaic cells.
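In the same spirit, a minimal one-hidden-layer network fitted by plain gradient descent to an invented wafer-input-to-yield mapping. The data, the target function, and the network sizes are all illustrative stand-ins, not the experimental set:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented stand-in for the wafer dataset: two normalised process inputs
# (e.g. diffusion temperature, coating thickness) and a hypothetical yield.
X = rng.uniform(-1, 1, size=(200, 2))
y = (0.8 - 0.3 * X[:, 0] ** 2 + 0.2 * X[:, 1]).reshape(-1, 1)

# One-hidden-layer network, trained by full-batch gradient descent.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

lr = 0.1
losses = []
for _ in range(500):
    h = np.tanh(X @ W1 + b1)          # hidden activations
    pred = h @ W2 + b2                # network output
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    # Backpropagation of the mean-squared-error gradient.
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    gh = err @ W2.T * (1 - h ** 2)    # tanh' = 1 - tanh^2
    gW1 = X.T @ gh / len(X); gb1 = gh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print(losses[0], losses[-1])  # training loss drops substantially
```

A held-out split of the measured wafers would play the role of the "test data for authenticity" mentioned above.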
Abstract:
This study focuses on teachers' opportunities and obstacles in performing skillful reading and writing instruction. It concerns the ability to accurately identify where students are in their reading and writing process and to help them develop good reading skills. It also concerns the ability to recognize signs of difficulties that students may have in their written language development and to know what efforts are needed to help them advance their reading and writing skills. The research is based on teachers' own statements and survey responses on the external conditions for teaching and on their approach, attitudes and knowledge in reading and writing. The empirical material consists of interviews, surveys and test data. The interview study was conducted with eight teachers. The questionnaire was answered by 249 teachers, while the knowledge test was taken by 269 teachers and 31 special education teachers. Many of the teachers in this study lack knowledge of the structure of language and of common Swedish spelling rules. Furthermore, it appears that a large proportion of them are unaccustomed to explaining students' reading development in detail, and find it difficult to systematically describe the aspects of daily literacy instruction. The overall picture is that many teachers teach without having tools to reflect on how their instruction really affects students' reading and writing. These shortcomings make it difficult to conduct effective literacy instruction. Once students have learned to decode, or if they have reading difficulties, many teachers seem to focus one-sidedly on getting students to read more. The consequence could be that those who need more practice in the technical basics of reading, or in comprehension strategies, are left without support. Lack of variety and individualization in fluency and comprehension training can hold back students' reading and writing development.
The teachers in the study who have the old junior school teacher or elementary school teacher education show the greatest knowledge of reading and writing (on the test). Good education can provide student teachers with professional skills that they may develop further in their careers. Knowledge of the meaning of phonological and phonemic awareness, as well as knowledge of how to count phonemes, appears to be important for knowledge of reading and writing (on the test). Knowledge of basic reading processes can be obtained through systematic and structured work with students' linguistic development, and through continuous dialogues with experienced colleagues on how and why questions. This is also an important way to work in teacher training. When essential professional skills are established in teacher education, students will, in practice, attain the school's learning goals.
Abstract:
As applications and systems evolve, the way we interact with them changes as well. So far, navigation and use of applications and systems have mostly been done by hand, through mouse and keyboard. More recently, navigation via touch screens and by voice has become increasingly common. When an application is to be controlled by voice, it is important that anyone can control it, regardless of their dialect. To examine how correctly a speech-recognition API (Application Programming Interface) interprets Swedish dialects, this study began with document studies of the characteristics and sound combinations of the dialects. These characteristics and sound combinations formed the basis for the words we selected to test the API with. Each dialect was thus assigned a word constructed to be especially hard for the API to interpret when pronounced in that particular dialect. A prototype was then developed, specifically an Android application that served as a tool in the data collection. Since the work includes both a prototype and a survey, Design and Creation Research was chosen as the research strategy, with document studies and observations as the data collection methods. Data were collected through observations, with the prototype as an aid, and through document studies. The empirical data recorded through the observations and the application showed that some dialects were easier for the API to interpret correctly. In some cases the results were expected, since, according to theory, certain words built from particular sound combinations would be pronounced very distinctively in a given dialect. Some of these words indeed scored very low, but others scored surprisingly high. Our conclusion was that the words we had selected in the expectation of low scores for a particular dialect proved to score low in only two cases.
Instead, it was the word containing the sje and tje sounds, which according to theory are characteristics common to all dialects, that received the lowest results overall.
Abstract:
Solar plus heat pump systems are often very complex in design, sometimes with special heat pump arrangements and control. Detailed heat pump models can therefore make system simulations very slow, and still deliver results that are not very accurate compared with real heat pump performance in a system. The idea here is to start from a standard measured performance map of test points for a heat pump according to EN 14825, and then determine characteristic parameters for a simplified correlation-based model of the heat pump. By plotting heat pump test data in different ways, including as power input and output and not only as COP, a simplified relation could be seen. By using the same methodology as in the EN 12975 QDT part of the collector test standard, it could be shown that a very simple model describes the heat pump test data very accurately, by identifying 4 parameters in the correlation equation found. © 2012 The Authors.
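The abstract does not state the correlation equation itself. As an illustration of the parameter-identification step, here is a four-parameter bilinear least-squares fit of electrical input power over an EN 14825-style map of test points; every number below is invented:

```python
import numpy as np

# Hypothetical EN 14825-style test points: source temperature, sink
# temperature (deg C) and measured electrical input power (kW).
pts = np.array([
    [-7, 35, 2.9], [2, 35, 2.5], [7, 35, 2.3], [12, 35, 2.1],
    [-7, 55, 4.1], [2, 55, 3.6], [7, 55, 3.3], [12, 55, 3.0],
])
Ts, Tk, P = pts[:, 0], pts[:, 1], pts[:, 2]

# Assumed 4-parameter correlation form (in the spirit of the abstract):
#   P_el = c0 + c1*Ts + c2*Tk + c3*Ts*Tk
A = np.column_stack([np.ones_like(Ts), Ts, Tk, Ts * Tk])
coeffs, *_ = np.linalg.lstsq(A, P, rcond=None)

pred = A @ coeffs
rmse = float(np.sqrt(np.mean((pred - P) ** 2)))
print(coeffs, rmse)  # four identified parameters, small residual
```

Once identified, such a correlation can be evaluated in a system simulation at a tiny fraction of the cost of a detailed physical heat pump model.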
Abstract:
A number of recent works have introduced statistical methods for detecting genetic loci that affect phenotypic variability, which we refer to as variability-controlling quantitative trait loci (vQTL). These are genetic variants whose allelic state predicts how much phenotype values will vary about their expected means. Such loci are of great potential interest in both human and non-human genetic studies, one reason being that a detected vQTL could represent a previously undetected interaction with other genes or environmental factors. The simultaneous publication of these new methods in different journals has in many cases precluded opportunity for comparison. We survey some of these methods, the respective trade-offs they imply, and the connections between them. The methods fall into three main groups: classical non-parametric, fully parametric, and semi-parametric two-stage approximations. Choosing between alternatives involves balancing the need for robustness, flexibility, and speed. For each method, we identify important assumptions and limitations, including those of practical importance, such as their scope for including covariates and random effects. We show in simulations that both parametric methods and their semi-parametric approximations can give elevated false positive rates when they ignore mean-variance relationships intrinsic to the data generation process. We conclude that choice of method depends on the trait distribution, the need to include non-genetic covariates, and the population size and structure, coupled with a critical evaluation of how these fit with the assumptions of the statistical model.
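One of the semi-parametric two-stage approximations surveyed can be sketched as: fit the mean model first, then regress the squared residuals on genotype (a Levene/Breusch-Pagan-style idea). A toy simulation with a purely variance-affecting locus; the sample size and effect sizes are invented:

```python
import random
import statistics

random.seed(1)

# Simulated biallelic locus: genotype dosage in {0, 1, 2}. The allele
# affects the *variance* of the phenotype but not its mean (a pure vQTL).
n = 3000
geno = [random.choice([0, 1, 2]) for _ in range(n)]
pheno = [random.gauss(0, 1 + 0.4 * g) for g in geno]

# Stage 1: remove the mean effect by centring within each genotype class.
means = {g: statistics.mean([p for p, gg in zip(pheno, geno) if gg == g])
         for g in (0, 1, 2)}
resid2 = [(p - means[g]) ** 2 for p, g in zip(pheno, geno)]

# Stage 2: regress squared residuals on dosage; a positive slope flags
# a variability-controlling locus.
gbar = statistics.mean(geno)
rbar = statistics.mean(resid2)
slope = (sum((g - gbar) * (r - rbar) for g, r in zip(geno, resid2))
         / sum((g - gbar) ** 2 for g in geno))
print(slope)  # clearly positive for this variance-affecting locus
```

The survey's caution applies directly here: if the trait has an intrinsic mean-variance relationship, stage 2 picks up mean effects in disguise and the false positive rate is inflated.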
Abstract:
The recent advances in CMOS technology have allowed for the fabrication of transistors with submicron dimensions, making possible the integration of tens of millions of devices in a single chip that can be used to build very complex electronic systems. Such an increase in design complexity has originated a need for more efficient verification tools that can incorporate more appropriate physical and computational models. Timing verification aims at determining whether the timing constraints imposed on the design can be satisfied or not. It can be performed by circuit simulation or by timing analysis. Although simulation tends to furnish the most accurate estimates, it presents the drawback of being stimuli-dependent. Hence, in order to ensure that the critical situation is taken into account, one must exercise all possible input patterns. Obviously, this is not feasible given the high complexity of current designs. To circumvent this problem, designers must rely on timing analysis. Timing analysis is an input-independent verification approach that models each combinational block of a circuit as a directed acyclic graph, which is used to estimate the critical delay. The first timing analysis tools used only circuit topology information to estimate circuit delay, and are thus referred to as topological timing analyzers. However, this method may result in overly pessimistic delay estimates, since the longest paths in the graph may not be able to propagate a transition, that is, they may be false. Functional timing analysis, in turn, considers not only circuit topology, but also the temporal and functional relations between circuit elements.
Functional timing analysis tools may differ in three aspects: the set of sensitization conditions required to declare a path sensitizable (the so-called path sensitization criterion), the number of paths handled simultaneously, and the method used to determine whether the sensitization conditions are satisfiable. Currently, the two most efficient approaches test the sensitizability of entire sets of paths at a time: one is based on automatic test pattern generation (ATPG) techniques and the other translates the timing analysis problem into a satisfiability (SAT) problem. Although timing analysis has been studied exhaustively over the last fifteen years, some specific topics have not yet received the required attention. One such topic is the applicability of functional timing analysis to circuits containing complex gates. This is the basic concern of this thesis. In addition, and as a necessary step to set the scene, a detailed and systematic study of functional timing analysis is also presented.
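The topological estimate that functional methods refine is just a longest-path computation over the combinational DAG. A minimal sketch on an invented netlist with made-up edge delays; note it ignores sensitization entirely, so the path it reports may be false:

```python
from collections import defaultdict, deque

# Toy gate-level netlist as a DAG; edge weights are invented gate/wire delays.
edges = {
    "a": [("g1", 2)], "b": [("g1", 2), ("g2", 3)],
    "g1": [("g3", 1)], "g2": [("g3", 2)],
    "g3": [("out", 1)],
}

def critical_delay(edges):
    """Topological timing analysis: longest path through the DAG (Kahn + DP)."""
    indeg = defaultdict(int)
    nodes = set(edges)
    for u, outs in edges.items():
        for v, _ in outs:
            indeg[v] += 1
            nodes.add(v)
    arrival = {n: 0 for n in nodes}          # latest signal arrival time
    q = deque(n for n in nodes if indeg[n] == 0)
    while q:
        u = q.popleft()
        for v, d in edges.get(u, []):
            arrival[v] = max(arrival[v], arrival[u] + d)
            indeg[v] -= 1
            if indeg[v] == 0:
                q.append(v)
    return max(arrival.values())

print(critical_delay(edges))  # longest path b -> g2 -> g3 -> out = 6
```

A functional analyzer would additionally ask (via ATPG or SAT) whether any input vector actually propagates a transition along that longest path before accepting 6 as the critical delay.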
Abstract:
The aim was to study the efficiency of the electrical conductivity test in evaluating the quality of castor seeds. Seedling emergence was evaluated in the field with five seed lots of cv. AL Guarany 2002. The seeds were submitted to the following tests: germination; first count of germination; accelerated aging (45 °C for 24 h at 100% RH); seedling emergence in the field; emergence speed index; and electrical conductivity, testing the soaking periods (2, 4, 6, 8 and 24 hours) and the numbers of seeds (25, 50 and 75) in 75 ml of distilled water at 25 °C. A completely randomized design was used, and the means were compared by the Tukey test at 5%. The mean laboratory and field test data were correlated. The electrical conductivity test for 4 hours with 25 seeds or 6 hours with 50 seeds, as well as the emergence speed index, proved efficient in ranking lot vigor, providing information equivalent to seedling emergence in the field.
Abstract:
Experiments were performed to study the effect of the surface properties of a vertical channel, heated by a source of thermal radiation, on inducing air flow through convection. Two channels (solar chimney prototypes) were built with glass plates, forming structures of truncated pyramidal geometry, with two surface finishes: transparent and opaque. Each chimney was mounted on a thermal energy absorber base with a central opening for the passage of air and subjected to heating by a radiant source comprising a bank of incandescent bulbs; field tests were also performed. Thermocouples were fixed on the bases and on the walls of the chimneys and connected to a computer data acquisition system. The air flow within the chimney, its speed and its temperature were measured using a hot-wire anemometer. Five experiments were performed for each chimney, recording convective flows between 17 m³/h and 22 m³/h and air flow velocities between 0.38 m/s and 0.56 m/s in the laboratory tests, and air velocities between 0.6 m/s and 1.1 m/s and convective airflows between 650 m³/h and 1150 m³/h in the field tests. The test data were compared with values obtained from semi-empirical equations valid for air flow induced into channels, and with data simulated from the first law of thermodynamics. It was found that the chimney with transparent walls induced more intense convective flows than the chimney with the matte (opaque) finish. Based on the results obtained, the prototype can be proposed for exhausting fumes, gases, vapors, mists and dusts in industrial environments, for helping to promote ventilation and air renewal in built environments, and for drying materials, fruits and seeds.
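The semi-empirical equations referred to are typically variants of the standard stack-effect relation. A sketch of that relation with invented geometry and an assumed discharge coefficient of 0.65 (not the prototype's actual dimensions or coefficients):

```python
import math

def chimney_flow(area_m2, height_m, t_in_c, t_out_c, cd=0.65):
    """Stack-effect estimate of buoyancy-driven volumetric air flow (m^3/h).

    cd is an assumed discharge coefficient; geometry and temperatures
    below are illustrative, not the prototype's measured values.
    """
    t_in = t_in_c + 273.15   # air temperature inside the channel (K)
    t_out = t_out_c + 273.15  # ambient air temperature (K)
    v = math.sqrt(2 * 9.81 * height_m * (t_in - t_out) / t_out)  # m/s
    return cd * area_m2 * v * 3600

flow = chimney_flow(area_m2=0.01, height_m=1.5, t_in_c=45, t_out_c=25)
print(round(flow, 1))  # of the same order as the laboratory flows reported
```

The comparison in the study works the other way around: measured temperatures go into such a relation, and the predicted flow is checked against the anemometer readings.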
Abstract:
The objectives of the study were to evaluate the sagittal-plane alignment of the spine in individuals with altered gibbosity (rib hump) measurements, comparing them with a group without alterations; to test the reliability of the instrument used; and to verify whether correlations exist between gibbosity measurements and the values of the vertebral curvatures. Forty young people were evaluated, divided into a control group - absence of gibbosities, or gibbosities smaller than 0.5 cm in the thoracic curvature and 0.7 cm in the lumbar curvature (n=20) - and an experimental group - gibbosities larger than those described (n=20). Gibbosity and the sagittal-plane curvatures were measured with an instrument adapted to a water level, together with the Adams test. Data were collected on two separate dates in both groups. After applying the Mann-Whitney test, no difference was found between the collection occasions and, pairing the groups, a difference was found only in the cervical measurement. Checking for relationships between the collected measurements, linear correlation (Spearman) was found in the control group between thoracic curvature and thoracic gibbosity; in both groups between the thoracic and lumbar curvatures; and in the experimental group between thoracic gibbosity and the lumbar and sacral curvatures, and between the sacral curvature and the thoracic and lumbar curvatures. It can be concluded that the gibbosity measurement is related to the sagittal-plane curvatures. Being a reliable, simple and accessible method, it can be reproduced without high financial cost and without harming the patient's health.
Abstract:
Following studies in Applied Linguistics, this thesis adopts an interdisciplinary perspective (Critical Discourse Analysis, Sociology towards Social Change, Cultural Studies and Systemic-Functional Linguistics). The overall objective of the research was to analyze the discourses of Elementary School teachers in the state of Sergipe by means of the discursive representations of the social actors, the processes of subjectivity and their fragmented identities in the context of standardized evaluations, before the requirements of globalized pedagogical practices based on result-based management. The critical analysis of these discourses was motivated by the rapid pace at which demands for innovation enter the classroom, aimed at reaching targets in the ranking indexes that characterize the globalized discourse of national education management, such as Ideb (Basic Education Development Index), which makes teachers change their discourses, fall silent or remain resistant. The work was initially grounded in the theoretical lines of Critical Discourse Analysis (FAIRCLOUGH, 2001, 2006), and poses a proposal for this purpose: the ASCD Discourse Sociological and Communicative Approach (PEDROSA, 2012, 2013). This is an interpretative-qualitative study within Critical Discourse Analysis (FAIRCLOUGH, 2001, 2003; RAMALHO; RESENDE, 2011); to carry it out, semi-structured interviews were used as instruments of data generation (BAUER; GASKELL, 2011; GILL, 2011). Its corpus is composed of thirteen accounts of Elementary School teachers who teach Portuguese and work in the fifteen schools chosen as the universe of the research at the Regional Board of Education (02) in the state of Sergipe. These narratives relate to their impressions, expectations and actions in the face of the result-based management to which they have to submit.
The analytical overview, along sociological and discursive lines, draws on the pan-semiotic categories (Inclusion and Exclusion) that appear in the theory of the Representation of Social Actors (VAN LEEUWEN, 1997, 2008). To present the processes of subjectivity of these teachers, this work is based on the socio-analytical proposal for the classification of subjects, which stems from the individual's work in the "Gestão Relacional de Si" and comes from Applied Sociology towards Social Change (BAJOIT, 2006, 2009). The discursive analyses were guided, for the most part word by word, by Systemic Functional Grammar as their theoretical basis, specifically by the processes of the Transitivity System postulated by Halliday (1985), Halliday and Matthiessen (2004), Eggins (2004), and Cunha and Souza (2011). The work brings in the field of Cultural Studies for the dialogue and the presentation of the fragmented identities of the teachers in the context of late modernity (GIDDENS, 2002; HALL, 2011). The thesis promoted a reflection on the condition of the teacher, immersed in this context of knowledge construction in the present Brazilian educational system, with its standardized evaluations, development indexes, targets and rankings. The considerations and outcomes of the research deal with the teachers' emerging social practices and the need for planned initial and continuing teacher education in view of the new moment that can be foreseen.