926 results for precision experiment
Abstract:
Retrieving information from Twitter is challenging due to its large volume, inconsistent writing and noise. Most existing information retrieval (IR) and text mining methods focus on term-based approaches, but suffer from problems of term variation such as polysemy and synonymy. These problems are exacerbated on Twitter because of its length limit. Over the years, people have held the hypothesis that pattern-based methods should perform better than term-based methods because they provide more context, but few studies have been conducted to support this hypothesis, especially on Twitter. This paper presents an innovative framework to address the problem of performing IR on microblogs. The proposed framework discovers patterns in tweets as higher-level features and assigns weights to low-level features (i.e. terms) based on their distributions within the higher-level features. We present experimental results on the TREC11 microblog dataset and show that our proposed approach significantly outperforms the term-based methods Okapi BM25 and TF-IDF, as well as pattern-based methods, in terms of precision, recall and F-measure.
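The abstract does not spell out the weighting scheme, so the following is only a minimal sketch of the general idea it describes, assuming frequent term co-occurrence patterns as the higher-level features and distributing each pattern's support over its member terms. The function names and the support-splitting rule are illustrative, not the paper's.

```python
# Hypothetical sketch: weight terms by their distribution across
# frequent term patterns, as the abstract describes (not the paper's
# actual formula, which is not given here).
from itertools import combinations
from collections import Counter

def mine_patterns(tweets, min_support=2, max_len=3):
    """Mine frequent term co-occurrence patterns from tokenized tweets."""
    counts = Counter()
    for tokens in tweets:
        unique = sorted(set(tokens))
        for r in range(2, max_len + 1):
            for pattern in combinations(unique, r):
                counts[pattern] += 1
    return {p: c for p, c in counts.items() if c >= min_support}

def term_weights(patterns):
    """Distribute each pattern's support evenly over its member terms."""
    weights = Counter()
    for pattern, support in patterns.items():
        for term in pattern:
            weights[term] += support / len(pattern)
    return weights

tweets = [["storm", "brisbane", "flood"],
          ["flood", "warning", "brisbane"],
          ["storm", "flood", "warning"]]
print(term_weights(mine_patterns(tweets)).most_common(3))
```

Terms that participate in many frequent patterns (here, "flood") receive higher weight, which is the intuition behind using pattern distributions to re-weight terms.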
Abstract:
Background: A random QTL effects model uses a function of the probabilities that two alleles, in the same or in different animals, at a particular genomic position are identical by descent (IBD). Estimates of such IBD probabilities, and therefore the modeling and estimation of QTL variances, depend on marker polymorphism, the strength of linkage and linkage disequilibrium between markers and QTL, and the relatedness of animals in the pedigree. The effect of the relatedness of animals in a pedigree on IBD probabilities and their characteristics was examined in a simulation study. Results: The study was based on nine multi-generational family structures, similar to the pedigree structure of a real dairy population, distinguished by inbreeding levels increasing from zero to 28% across the studied population. The highest inbreeding level in the pedigree, associated with the highest relatedness, was accompanied by the highest IBD probabilities of two alleles at the same locus and by lower coefficients of relative variation. Profiles of the correlation coefficients between IBD probabilities along the marked chromosomal segment and those at the true QTL position were steepest when the inbreeding coefficient in the pedigree was highest. The precision of estimated QTL location increased with increasing inbreeding and pedigree relatedness. A method to assess the optimum level of inbreeding for QTL detection is proposed, depending on population parameters. Conclusions: An increased overall relationship in a QTL mapping design has positive effects on the precision of QTL position estimates. However, the relationship between the inbreeding level and the capacity for QTL detection, depending on the recombination rate between the QTL and the adjacent informative marker, is not linear. © 2010 Freyer et al., licensee BioMed Central Ltd.
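The pedigree relatedness and inbreeding levels that drive these IBD probabilities are conventionally computed with the tabular method for the numerator relationship matrix. The sketch below is generic textbook machinery, not code from the study, and the toy pedigree is hypothetical.

```python
# A minimal sketch of the tabular method for the numerator relationship
# matrix A; the inbreeding coefficient of animal i is A[i][i] - 1.
# The toy pedigree is hypothetical, not from the paper.
import numpy as np

def relationship_matrix(sires, dams):
    """sires/dams: parent indices per animal (None if unknown), ordered
    so that parents precede their offspring."""
    n = len(sires)
    A = np.zeros((n, n))
    for i in range(n):
        s, d = sires[i], dams[i]
        # Diagonal: 1 plus half the relationship between the parents.
        A[i, i] = 1.0 + (0.5 * A[s, d] if s is not None and d is not None else 0.0)
        for j in range(i):
            a = 0.5 * ((A[j, s] if s is not None else 0.0) +
                       (A[j, d] if d is not None else 0.0))
            A[i, j] = A[j, i] = a
    return A

# Full-sib mating: animals 0,1 founders; 2,3 full sibs; 4 their offspring.
A = relationship_matrix([None, None, 0, 0, 2], [None, None, 1, 1, 3])
print(f"Inbreeding of animal 4: {A[4, 4] - 1:.2f}")  # 0.25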
Abstract:
Based on the fact that a Fiber Bragg Grating (FBG) is sensitive to both temperature and strain, we design and manufacture a high-accuracy FBG temperature sensor for earthquake precursor monitoring using a bimetal structure of Al and Fe-Ni alloy. Furthermore, we analyze the accuracy of sensitivity-enhanced FBG sensors for the first time and find that the accuracy reaches ±0.05°C, with the highest resolution of any FBG temperature sensor to date (0.0014°C/pm). This work experimentally demonstrates the feasibility of using FBGs for earthquake precursor monitoring and lays the foundation for the application of optical technology in this field.
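A quick sanity check of the quoted resolution, assuming a 1 pm detectable wavelength shift (a common interrogator limit, not stated in the abstract):

```python
# Worked check of the quoted resolution, assuming a detectable
# wavelength shift of 1 pm (a common interrogator limit).
resolution_c_per_pm = 0.0014          # °C per pm, from the abstract
sensitivity_pm_per_c = 1 / resolution_c_per_pm
print(f"Implied sensitivity: {sensitivity_pm_per_c:.0f} pm/°C")  # ~714 pm/°C
# A bare FBG is roughly 10 pm/°C near 1550 nm, so the bimetal
# structure would amplify the thermal response by roughly 70x.
```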
Abstract:
Background: Bicycle commuting in an urban environment with high air pollution is a known potential health risk, especially for susceptible individuals. While risk management strategies aimed at reducing exposure to motorised traffic emissions have been suggested, few studies have assessed the utility of such strategies in real-world circumstances. Objectives: The potential for reducing exposure to ultrafine particles (UFP; < 0.1 µm) during bicycle commuting by lowering interaction with motorised traffic was investigated with real-time air pollution and acute inflammatory measurements in healthy individuals using their typical bicycle commute route and an alternative to it. Methods: Thirty-five healthy adults (mean ± SD: age = 39 ± 11 yr; 29% female) each completed two return trips of their typical route (HIGH) and of a pre-determined altered route with lower interaction with motorised traffic (LOW; determined by the proportion of on-road cycle paths). Particle number concentration (PNC) and diameter (PD) were monitored in real time in-commute. Acute inflammatory indices of respiratory symptom incidence, lung function and spontaneous sputum (for inflammatory cell analyses) were collected immediately pre-commute, and one and three hours post-commute. Results: LOW resulted in a significant reduction in mean PNC (1.91 × 10⁴ ± 0.93 × 10⁴ vs. 2.95 × 10⁴ ± 1.50 × 10⁴ particles/cm³; p ≤ 0.001). Apart from the incidence of in-commute offensive odour detection (42 vs. 56%; p = 0.019), dust and soot observation (33 vs. 47%; p = 0.038) and nasopharyngeal irritation (31 vs. 41%; p = 0.007), acute inflammatory indices were not significantly associated with in-commute PNC, nor were they reduced with LOW compared to HIGH. Conclusions: Exposure to PNC, and the incidence of offensive odour and nasopharyngeal irritation, can be significantly reduced by a strategy of lowering interaction with motorised traffic while bicycle commuting, which may bring important benefits for both healthy and susceptible individuals.
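As a rough illustration of how such a route comparison can be tested, a paired test on per-rider mean PNC might look like the sketch below. The simulated values only mimic the reported means and SDs; they are not the study's data, and the paired design is an assumption consistent with each rider completing both routes.

```python
# Hypothetical re-analysis sketch: paired comparison of mean in-commute
# particle number concentration (PNC) between routes. Values are
# illustrative, not the study's raw measurements.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 35
pnc_high = rng.normal(2.95e4, 1.50e4, n).clip(min=1e3)  # typical route
pnc_low = rng.normal(1.91e4, 0.93e4, n).clip(min=1e3)   # altered route

t, p = stats.ttest_rel(pnc_high, pnc_low)  # paired: each rider did both routes
reduction = 1 - pnc_low.mean() / pnc_high.mean()
print(f"Mean PNC reduction: {reduction:.0%}, paired t = {t:.2f}, p = {p:.4f}")
```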
Abstract:
Deterministic computer simulation of physical experiments is now a common technique in science and engineering, since physical experiments are often too time-consuming, expensive or impossible to conduct. Running complex computer models or codes in place of physical experiments leads to the study of computer experiments, which are used to investigate many scientific phenomena. A computer experiment consists of a number of runs of the computer code with different input choices. The Design and Analysis of Computer Experiments is a rapidly growing area of statistical experimental design. This paper discusses some practical issues that arise when designing computer simulations and/or experiments for manufacturing systems. A case study approach is reviewed and presented.
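A standard starting point for choosing the input settings of such runs is the Latin hypercube design. The sketch below shows the generic technique, not anything specific to the paper's manufacturing case study.

```python
# A minimal sketch of a Latin hypercube design, a workhorse design
# for computer experiments (illustrative; not from the paper).
import numpy as np

def latin_hypercube(n_runs, n_factors, rng=None):
    """One random point in each of n_runs equal-probability strata per
    factor, with the strata independently permuted across factors."""
    rng = np.random.default_rng(rng)
    # Stratified draws in [0, 1): row i lands in stratum [i/n, (i+1)/n).
    u = (np.arange(n_runs)[:, None] + rng.random((n_runs, n_factors))) / n_runs
    # Shuffle each column independently to break the row alignment.
    for j in range(n_factors):
        u[:, j] = u[rng.permutation(n_runs), j]
    return u

design = latin_hypercube(10, 3, rng=42)  # 10 runs of a 3-input simulator
print(design.round(3))
```

Each input dimension is covered evenly with only n_runs code evaluations, which is exactly what makes these designs attractive when each simulator run is expensive.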
Abstract:
Using historical narrative and extensive archival research, this thesis portrays the story of the twentieth-century Queensland Rural Schools. The initiative started at Nambour Primary School in 1917 and extended over the next four decades to encompass thirty primary schools that functioned as centralised institutions training children in agricultural science, domestic science and manual trades. The Rural Schools formed the foundation of a systemised approach to agricultural education intended to facilitate the State's closer settlement ideology. Their purpose was to mitigate urbanisation, circumvent foreign incursion and increase Queensland's productivity by turning boys into farmers, or the tradesmen required to support them, and girls into the homemakers that these farmers needed as wives and mothers for the next generation. Effectively, Queensland took rural boys and girls and created a new yeomanry to aid the State's development.
Abstract:
This project researched the performance of emerging digital technology for high voltage electricity substations that significantly improves safety for staff and reduces the potential environmental impact of equipment failure. The experimental evaluation used a scale model of a substation control system that combined real substation control and networking equipment with real-time simulation of the power system. The outcomes confirm that it is possible to implement Ethernet networks in high voltage substations that meet the needs of utilities; however, component-level testing of devices is necessary to achieve this. The assessment results have been used to further develop international standards for substation communication and precision timing.
Abstract:
Process mining encompasses the research area which is concerned with knowledge discovery from event logs. One common process mining task focuses on conformance checking, comparing discovered or designed process models with actual real-life behavior as captured in event logs in order to assess the “goodness” of the process model. This paper introduces a novel conformance checking method to measure how well a process model performs in terms of precision and generalization with respect to the actual executions of a process as recorded in an event log. Our approach differs from related work in the sense that we apply the concept of so-called weighted artificial negative events towards conformance checking, leading to more robust results, especially when dealing with less complete event logs that only contain a subset of all possible process execution behavior. In addition, our technique offers a novel way to estimate a process model’s ability to generalize. Existing literature has focused mainly on the fitness (recall) and precision (appropriateness) of process models, whereas generalization has been much more difficult to estimate. The described algorithms are implemented in a number of ProM plugins, and a Petri net conformance checking tool was developed to inspect process model conformance in a visual manner.
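A much-simplified sketch of the negative-event idea: at each trace prefix observed in the log, behavior the model enables but the log never exhibits counts against precision. The toy successor-function model and the unweighted count are simplifications for illustration; the paper works with weighted artificial negative events and Petri net models.

```python
# Simplified precision via (unweighted) artificial negative events:
# model-enabled continuations never observed in the log act as
# negative events and lower precision.
from collections import defaultdict

log = [("a", "b", "c"), ("a", "c", "b"), ("a", "b", "c")]

def model_enabled(prefix):
    """Toy model: 'a' starts a case; then b, c and d may occur once each."""
    if not prefix:
        return {"a"}
    return {"b", "c", "d"} - set(prefix)

# Collect the continuations actually observed after each prefix.
observed = defaultdict(set)
for trace in log:
    for i in range(len(trace)):
        observed[trace[:i]].add(trace[i])

matched, allowed = 0, 0
for prefix, seen in observed.items():
    enabled = model_enabled(prefix)
    matched += len(enabled & seen)   # enabled and observed
    allowed += len(enabled)          # everything the model permits here

print(f"Behavioral precision ~ {matched / allowed:.2f}")  # < 1: 'd' never occurs
```

Because the toy model allows a 'd' that the log never contains, precision drops below 1, which is exactly the kind of extra behavior this family of metrics is designed to expose.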
Abstract:
In a classification problem we typically face two challenging issues: the diverse characteristics of negative documents, and the often large number of negative documents that are close to positive documents. It is therefore hard for a single classifier to clearly assign incoming documents to classes. This paper proposes a novel gradual problem-solving approach that creates a two-stage classifier. The first stage identifies reliable negatives (negative documents with weak positive characteristics); it concentrates on minimizing the number of false negative documents (recall-oriented). We use Rocchio, an existing recall-based classifier, for this stage. The second stage is a precision-oriented "fine tuning" that concentrates on minimizing the number of false positive documents by applying pattern (statistical phrase) mining techniques. In this stage, pattern-based scoring is followed by threshold setting (thresholding). Experiments show that our statistical-phrase-based two-stage classifier is promising.
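A hedged sketch of the two-stage pipeline described above. The centroid scoring, the phrase-containment pattern score and all names are illustrative stand-ins for the paper's Rocchio classifier and pattern mining, whose exact formulations are not given here.

```python
# Sketch of a recall-oriented Rocchio-style filter followed by a
# precision-oriented pattern score with a threshold (illustrative only).
import numpy as np

def rocchio_score(doc_vec, pos_centroid, neg_centroid):
    """Stage 1: is the document closer to the positive centroid?"""
    return np.dot(doc_vec, pos_centroid) - np.dot(doc_vec, neg_centroid)

def pattern_score(tokens, patterns):
    """Stage 2: sum of supports of mined phrase patterns the doc contains."""
    text = " ".join(tokens)
    return sum(w for phrase, w in patterns.items() if phrase in text)

def classify(doc_vec, tokens, pos_c, neg_c, patterns, theta):
    if rocchio_score(doc_vec, pos_c, neg_c) <= 0:
        return "negative"            # reliable negative, filtered early
    return "positive" if pattern_score(tokens, patterns) >= theta else "negative"

patterns = {"machine learning": 3.0, "neural network": 2.0}  # toy mined phrases
pos_c, neg_c = np.array([1.0, 0.0]), np.array([0.0, 1.0])
doc = np.array([0.8, 0.2])
print(classify(doc, ["machine", "learning", "rocks"], pos_c, neg_c, patterns, 2.5))
```

The division of labor is the point: stage 1 is tuned to let through anything remotely positive (high recall), and stage 2's threshold theta trims the resulting false positives (high precision).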
Abstract:
The generic alliance game considers players in an alliance who fight against an external enemy. After victory, the alliance may break up, and its members may fight against each other over the spoils of victory. Our experimental analysis of this game shows that in-group solidarity vanishes after the break-up of the alliance: former 'brothers in arms' fight even more vigorously against each other than strangers do. Furthermore, this vigorous internal fighting is anticipated and reduces the alliance's ability to mobilize joint fighting effort, compared to a situation in which victorious alliance members share the spoils of victory equally and peacefully.
Abstract:
While forensic psychology is commonly associated with the criminal and family law domains, its ambit to offer skills and knowledge at the legal interface also makes it particularly suited to the civil law domain. At this time, civil law is arguably the least represented legislative area in terms of psychological research and professional commentary. However, it is also a broad area, with its very breadth providing scope for research consideration, as urged by Greene. The purposes of this article are (1) to review the broad role of the psychologist in the conduct of civil litigation matters in Australia, (2) to assist the novice to the area by indicating a non-exhaustive list of potentially ambiguous terms and concepts common to the conduct of professional practice, and (3) to highlight, as an example, one area of practice where not only does legal direction demand professional pragmatism but opportunity also arises for psychological research to vitally address a major social issue.
Abstract:
We exploit a voting reform in France to estimate the causal effect of exit poll information on turnout and bandwagon voting. Before the change in legislation, individuals in some French overseas territories voted after the election result had already been made public via exit poll information from mainland France. We estimate that knowing the exit poll information decreases voter turnout by about 12 percentage points. Our study provides the first clean empirical design outside the laboratory to demonstrate the effect of such knowledge on voter turnout. Furthermore, we find that exit poll information significantly increases bandwagon voting; that is, voters who choose to turn out are more likely to vote for the expected winner.
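One plausible way to operationalize such a before/after, treated/untreated comparison is a difference-in-differences regression. The specification, variable names and toy data below are assumptions for illustration, not the authors' exact estimator.

```python
# Difference-in-differences sketch: the overseas:post interaction
# captures the turnout effect of (losing) exit poll information,
# under the usual parallel-trends assumption. Data are toy values.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "turnout":  [55, 54, 43, 42, 56, 55, 55, 54],  # percentage points
    "overseas": [0, 0, 1, 1, 0, 0, 1, 1],          # voted after exit polls pre-reform
    "post":     [0, 0, 0, 0, 1, 1, 1, 1],          # after the voting reform
})

model = smf.ols("turnout ~ overseas * post", data=df).fit()
print(model.params["overseas:post"])  # DiD estimate of the turnout effect
```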
Abstract:
One of very few field experiments in tax compliance, this study generates a unique data set on Swiss taxpayers’ underdeclaration of income and wealth and overdeduction of tax credits by obtaining exclusive access to tax-return corrections made by the tax administration. Using this commune-level data from Switzerland, it explores the influence on tax compliance of moral suasion, introduced through a treatment in which taxpayers receive a letter containing normative appeals signed by the commune’s fiscal commissioner. This letter also serves to operationalize elements of social identity and (mutual) trust. Interestingly, the results not only echo the earlier finding that moral suasion has barely any effect on taxpayer compliance, but show clear differences between underdeclaration and overdeduction.
Abstract:
In recent years several works have investigated a formal model for Information Retrieval (IR) based on the mathematical formalism underlying quantum theory. These works have mainly exploited geometric and logical-algebraic features of the quantum formalism, for example entanglement, superposition of states, collapse onto basis states, and lattice relationships. In this poster I present an analogy between a typical IR scenario and the double-slit experiment. This experiment exhibits interference phenomena between events in a quantum system, causing the Kolmogorovian law of total probability to fail. The analogy allows us to put forward routes for the application of quantum probability theory in IR. However, several questions still need to be addressed; they will be the subject of my PhD research.
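The failure of the law of total probability alluded to here can be made explicit. With both slits open, quantum mechanics predicts the detection density below (the standard textbook form, not a formula taken from the poster):

```latex
% Detection probability with both slits open: the classical mixture
% p_1 + p_2 is corrected by an interference term.
p_{12}(x) \;=\; p_1(x) + p_2(x)
  + \underbrace{2\sqrt{p_1(x)\,p_2(x)}\,\cos\theta(x)}_{\text{interference term}}
```

Kolmogorov's law of total probability, P(D) = P(D|S_1)P(S_1) + P(D|S_2)P(S_2), admits no such cross term; the poster's analogy maps IR events onto the slit events so that quantum probability, rather than Kolmogorovian probability, becomes the natural model for the interference.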