32 results for Optimal test set
Abstract:
In common with other farmland species, hares (Lepus spp.) are in widespread decline in agricultural landscapes due to agricultural intensification and habitat loss. We examined the importance of habitat heterogeneity to the Irish hare (Lepus timidus hibernicus) in a pastoral landscape. We used radio-tracking during nocturnal active and diurnal inactive periods throughout one year. In autumn, winter and spring, hares occupied a heterogeneous combination of improved grassland, providing food, and Juncus-dominated rough pasture, providing refuge. In summer, hares significantly increased their use of improved grassland. This homogeneous habitat can fulfil the discrete and varied resource requirements of hares for feeding and shelter at certain times of year. However, improved grassland may be a risky habitat for hares as silage harvesting occurs during their peak birthing period of late spring and early summer. We therefore posit a putative ecological trap: a homogeneous habitat of perceived high value that satisfies the hares' habitat requirements but presents risks at a critical time of year. To test this hypothesis in relation to hare populations, work is required to provide data on differential leveret mortality between habitat types.
Abstract:
An optimal search theory, the so-called Lévy-flight foraging hypothesis [1], predicts that predators should adopt search strategies known as Lévy flights where prey is sparse and distributed unpredictably, but that Brownian movement is sufficiently efficient for locating abundant prey [2-4]. Empirical studies have generated controversy because the accuracy of statistical methods that have been used to identify Lévy behaviour has recently been questioned [5,6]. Consequently, whether foragers exhibit Lévy flights in the wild remains unclear. Crucially, moreover, it has not been tested whether observed movement patterns across natural landscapes having different expected resource distributions conform to the theory's central predictions. Here we use maximum-likelihood methods to test for Lévy patterns in relation to environmental gradients in the largest animal movement data set assembled for this purpose. Strong support was found for Lévy search patterns across 14 species of open-ocean predatory fish (sharks, tuna, billfish and ocean sunfish), with some individuals switching between Lévy and Brownian movement as they traversed different habitat types. We tested the spatial occurrence of these two principal patterns and found Lévy behaviour to be associated with less productive waters (sparser prey) and Brownian movements to be associated with productive shelf or convergence-front habitats (abundant prey). These results are consistent with the Lévy-flight foraging hypothesis [1,7], supporting the contention [8,9] that organism search strategies naturally evolved in such a way that they exploit optimal Lévy patterns.
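As a rough illustration of the kind of likelihood-based model selection referred to above, the sketch below fits a power-law (Lévy) and an exponential (Brownian) model to a set of step lengths by maximum likelihood and compares them by AIC. It uses synthetic data and the simplest textbook estimators; it is not the analysis pipeline used in the study.

```python
# Hypothetical sketch: MLE comparison of a power-law (Levy) versus an exponential
# (Brownian) model for move step lengths, compared by AIC.
import numpy as np

def fit_power_law(steps, x_min):
    """MLE for p(x) = (mu-1)/x_min * (x/x_min)^(-mu), x >= x_min."""
    s = steps[steps >= x_min]
    n = len(s)
    mu = 1.0 + n / np.sum(np.log(s / x_min))
    loglik = n * np.log(mu - 1) - n * np.log(x_min) - mu * np.sum(np.log(s / x_min))
    return mu, loglik

def fit_exponential(steps, x_min):
    """MLE for p(x) = lam * exp(-lam * (x - x_min)), x >= x_min."""
    s = steps[steps >= x_min]
    n = len(s)
    lam = 1.0 / np.mean(s - x_min)
    loglik = n * np.log(lam) - lam * np.sum(s - x_min)
    return lam, loglik

rng = np.random.default_rng(0)
x_min = 1.0
u = rng.random(5000)
steps = x_min * (1.0 - u) ** (-1.0)        # inverse-CDF sample from a power law with mu = 2

mu, ll_pl = fit_power_law(steps, x_min)
lam, ll_exp = fit_exponential(steps, x_min)
aic_pl, aic_exp = 2 * 1 - 2 * ll_pl, 2 * 1 - 2 * ll_exp
print(f"power law: mu={mu:.2f}, AIC={aic_pl:.1f}")
print(f"exponential: lam={lam:.3f}, AIC={aic_exp:.1f}")
print("preferred model:", "Levy (power law)" if aic_pl < aic_exp else "Brownian (exponential)")
```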
Abstract:
We present optical (UBVRI) and near-IR (YJHK) photometry of the normal Type Ia supernova (SN) 2004S. We also present eight optical spectra and one near-IR spectrum of SN 2004S. The light curves and spectra are nearly identical to those of SN 2001el. This is the first time we have seen optical and IR light curves of two Type Ia SNe match so closely. Within the one-parameter family of light curves for normal Type Ia SNe, that two objects should have such similar light curves implies that they had identical intrinsic colors and produced similar amounts of Ni-56. From the similarities of the light-curve shapes we obtain a set of extinctions as a function of wavelength that allows a simultaneous solution for the distance modulus difference of the two objects, the difference of the host galaxy extinctions, and R_V. Since SN 2001el had roughly an order of magnitude more host galaxy extinction than SN 2004S, the value of R_V = 2.15 (+0.24, -0.22) pertains primarily to dust in the host galaxy of SN 2001el. We have also shown via Monte Carlo simulations that adding rest-frame J-band photometry to the complement of BVRI photometry of Type Ia SNe decreases the uncertainty in the distance modulus by a factor of 2.7. A combination of rest-frame optical and near-IR photometry clearly gives more accurate distances than using rest-frame optical photometry alone.
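For context, the quoted R_V links the total V-band extinction to the colour excess, and extinction enters the distance modulus as shown below; these are standard textbook definitions, not equations reproduced from the paper.

```latex
% Standard definitions assumed here (not taken from the paper):
% total-to-selective extinction ratio and the extinction-corrected distance modulus.
R_V \equiv \frac{A_V}{E(B-V)}, \qquad
\mu = m_\lambda - M_\lambda - A_\lambda ,
```

where A_lambda is the extinction at wavelength lambda and mu the distance modulus.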
Abstract:
Aiming to establish a rigorous link between macroscopic random motion (described, e.g., by Langevin-type theories) and microscopic dynamics, we have undertaken a kinetic-theoretical study of the dynamics of a classical test particle weakly coupled to a large heat bath in thermal equilibrium. Both subsystems are subject to an external force field. From the (time-non-local) generalized master equation, a Fokker-Planck-type equation follows as a "quasi-Markovian" approximation. The kinetic operator thus obtained is shown to be ill-defined; specifically, it does not preserve the positivity of the test-particle distribution function f(x, v; t). An alternative approach, previously introduced for open quantum systems, is proposed; it leads to a correct kinetic operator that exhibits all the expected properties. Explicit expressions for the diffusion and drift coefficients are obtained, allowing macroscopic diffusion and dynamical friction phenomena to be modelled in terms of the external field and intrinsic physical parameters.
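For orientation, a quasi-Markovian kinetic equation of this type has, in one common convention, the generic Fokker-Planck (Kramers) form sketched below; the specific drift and diffusion coefficients A and D are what the paper derives from the microscopic model, so the expression shown is the standard textbook form, not the paper's final operator.

```latex
% Generic Fokker-Planck / Kramers form, shown for illustration only; sign and
% normalization conventions vary between references.
\frac{\partial f}{\partial t}
+ v\,\frac{\partial f}{\partial x}
+ \frac{F_{\mathrm{ext}}(x)}{m}\,\frac{\partial f}{\partial v}
= \frac{\partial}{\partial v}\bigl[\,A(x,v)\,f\,\bigr]
+ \frac{\partial^{2}}{\partial v^{2}}\bigl[\,D(x,v)\,f\,\bigr]
```

Here A plays the role of the dynamical-friction (drift) coefficient and D the velocity-space diffusion coefficient.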
Abstract:
Support vector machines (SVMs), though accurate, are not preferred in applications requiring high classification speed or when deployed in systems of limited computational resources, due to the large number of support vectors involved in the model. To overcome this problem we have devised a primal SVM method with the following properties: (1) it solves for the SVM representation without the need to invoke the representer theorem, (2) forward and backward selections are combined to approach the final globally optimal solution, and (3) a criterion is introduced for identification of support vectors, leading to a much reduced support vector set. In addition to introducing this method, the paper analyzes the complexity of the algorithm and presents test results on three public benchmark problems and a human activity recognition application. These experiments demonstrate the effectiveness and efficiency of the proposed algorithm.
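The following is a minimal, hypothetical sketch of the general idea of training a kernel SVM in the primal over a small, greedily grown basis set so that the final model needs far fewer expansion vectors. It is not the authors' algorithm; their forward-backward selection scheme and support-vector criterion are not reproduced here.

```python
# Hypothetical sketch (not the paper's method): primal kernel SVM with a squared
# hinge loss, fitted over a small basis set grown by a simple forward heuristic.
import numpy as np

def rbf(A, B, gamma=0.5):
    d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def fit_reduced_svm(X, y, n_basis=10, lam=1e-2, lr=0.05, epochs=300):
    basis, beta = [], np.zeros(0)
    for _ in range(n_basis):
        # forward step: add the currently worst-classified point to the basis
        f = rbf(X, X[basis]) @ beta if basis else np.zeros(len(X))
        basis.append(int(np.argmin(y * f)))
        beta = np.append(beta, 0.0)
        K, Kbb = rbf(X, X[basis]), rbf(X[basis], X[basis])
        for _ in range(epochs):     # refit the expansion coefficients in the primal
            slack = np.maximum(0.0, 1.0 - y * (K @ beta))
            grad = -2 * K.T @ (y * slack) / len(X) + 2 * lam * Kbb @ beta
            beta -= lr * grad
    return X[basis], beta

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] * X[:, 1] > 0, 1.0, -1.0)      # XOR-like toy problem
B, beta = fit_reduced_svm(X, y)
acc = np.mean(np.sign(rbf(X, B) @ beta) == y)
print(f"basis size: {len(B)}, training accuracy: {acc:.2f}")
```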
Abstract:
The Irish case provides a particularly appropriate test of the increasing merit selection hypothesis deriving from the liberal theory of industrialization. This is so not only because the lateness and speed of economic change allow us to capture such change through a set of national surveys conducted in the past three decades, but also because such change was based on a sustained policy of increased openness to international competitive forces. The functional requirements of the economy and a rapid increase in the supply of those with higher educational qualifications provided an ideal context in which to observe the movement from ascription to achievement predicted by the liberal theory. However, while changes in the class structure and a rapid expansion of educational opportunity had significant consequences in terms of absolute mobility, there was no evidence of a significant shift towards meritocratic principles. At the same time as the service class increased their advantage over other classes in the pursuit of educational qualifications, the impact of educational qualifications on class destination diminished. Controlling for education, we find that the effect of class origin is substantial and shows little sign of diminishing over time. In our conclusion we discuss the implications of our findings in the context of the recent debate on meritocracy.
Abstract:
Motivation: To date, Gene Set Analysis (GSA) approaches primarily focus on identifying differentially expressed gene sets (pathways). Methods for identifying differentially coexpressed pathways also exist but are mostly based on aggregated pairwise correlations, or other pairwise measures of coexpression. Instead, we propose Gene Sets Net Correlations Analysis (GSNCA), a multivariate differential coexpression test that accounts for the complete correlation structure between genes.
Results: In GSNCA, weight factors are assigned to genes in proportion to the genes' cross-correlations (intergene correlations). The problem of finding the weight vectors is formulated as an eigenvector problem with a unique solution. GSNCA tests the null hypothesis that, for a gene set, there is no difference in the weight vectors of the genes between two conditions. In simulation studies and analyses of experimental data, we demonstrate that GSNCA indeed captures changes in the structure of genes' cross-correlations rather than differences in the averaged pairwise correlations. Thus, GSNCA infers differences in coexpression networks while bypassing the method-dependent steps of network inference. As an additional result from GSNCA, we define hub genes as genes with the largest weights and show that these genes frequently correspond to major and specific pathway regulators, as well as to genes that are most affected by the biological difference between the two conditions. In summary, GSNCA is a new approach for the analysis of differentially coexpressed pathways that also evaluates the importance of the genes in the pathways, thus providing unique information that may result in the generation of novel biological hypotheses.
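A minimal sketch of the core computation follows, under the assumption that the gene weights are taken from the dominant eigenvector of the absolute inter-gene correlation matrix in each condition and compared by an L1 distance; the permutation test that would supply a p-value, and other details of GSNCA, are omitted.

```python
# Minimal sketch of the GSNCA idea under stated assumptions; a permutation test
# (shuffling sample labels) would supply the p-value in practice.
import numpy as np

def gene_weights(expr):
    """expr: genes x samples matrix for one condition."""
    corr = np.abs(np.corrcoef(expr))
    np.fill_diagonal(corr, 0.0)               # use only cross-correlations
    vals, vecs = np.linalg.eigh(corr)
    w = np.abs(vecs[:, -1])                   # dominant eigenvector (Perron vector)
    return w / w.sum()                        # normalized gene weights

def gsnca_statistic(expr_a, expr_b):
    wa, wb = gene_weights(expr_a), gene_weights(expr_b)
    return np.sum(np.abs(wa - wb)), wa, wb

rng = np.random.default_rng(0)
genes, samples = 20, 30
cond_a = rng.normal(size=(genes, samples))
cond_b = rng.normal(size=(genes, samples))
# in condition B, gene 0 becomes a "hub" driving several other genes
cond_b[1:5] = cond_b[0] + rng.normal(scale=0.3, size=(4, samples))

stat, wa, wb = gsnca_statistic(cond_a, cond_b)
print(f"test statistic (L1 weight difference): {stat:.3f}")
print("largest-weight gene in condition B (candidate hub):", int(np.argmax(wb)))
```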
Abstract:
This paper proposes the use of an improved covariate unit root test which exploits the cross-sectional dependence information when the panel-data null hypothesis of a unit root is rejected. More explicitly, to increase the power of the test, we suggest the use of more than one covariate and offer several ways to select the ‘best’ covariates from the set of potential covariates represented by the individuals in the panel. Employing our methods, we investigate the Prebisch-Singer hypothesis for nine commodity prices. Our results show that this hypothesis holds for all but the price of petroleum.
Abstract:
Reducing wafer metrology continues to be a major target in semiconductor manufacturing efficiency initiatives because it is a high-cost, non-value-added operation that impacts cycle time and throughput. However, metrology cannot be eliminated completely given the important role it plays in process monitoring and advanced process control. To achieve the required manufacturing precision, measurements are typically taken at multiple sites across a wafer. The selection of these sites is usually based on a priori knowledge of wafer failure patterns and spatial variability, with additional sites added over time in response to process issues. As a result, it is often the case that in mature processes significant redundancy exists in wafer measurement plans. This paper proposes a novel methodology based on Forward Selection Component Analysis (FSCA) for analyzing historical metrology data in order to determine the minimum set of wafer sites needed for process monitoring. The paper also introduces a virtual metrology (VM) based approach for reconstructing the complete wafer profile from the optimal sites identified by FSCA. The proposed methodology is tested and validated on a wafer manufacturing metrology dataset. © 2012 IEEE.
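The sketch below illustrates the general FSCA-style idea on synthetic data: greedily select the measurement sites that best explain the variance of all sites, then reconstruct the full wafer profile from the selected sites by least squares. It is an assumption-laden toy version, not the paper's implementation.

```python
# Illustrative toy version of forward selection of wafer sites plus a virtual-metrology
# style reconstruction; the data, site count and selection details are assumptions.
import numpy as np

def forward_select_sites(X, k):
    """X: wafers x sites (mean-centred). Returns indices of k selected sites."""
    selected = []
    total_var = np.sum(X ** 2)
    for _ in range(k):
        best_site, best_explained = None, -np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            S = X[:, selected + [j]]
            # variance explained by regressing all sites on the candidate subset
            coef, *_ = np.linalg.lstsq(S, X, rcond=None)
            explained = total_var - np.sum((X - S @ coef) ** 2)
            if explained > best_explained:
                best_site, best_explained = j, explained
        selected.append(best_site)
    return selected

rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 3))                 # 3 underlying spatial patterns
loadings = rng.normal(size=(3, 25))                # 25 candidate measurement sites
X = latent @ loadings + 0.05 * rng.normal(size=(100, 25))
X -= X.mean(axis=0)

sites = forward_select_sites(X, k=3)
coef, *_ = np.linalg.lstsq(X[:, sites], X, rcond=None)
reconstruction = X[:, sites] @ coef                # estimate full profile from selected sites
err = np.linalg.norm(X - reconstruction) / np.linalg.norm(X)
print("selected sites:", sites, f"relative reconstruction error: {err:.3f}")
```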
Abstract:
In this paper the tracking system used to perform a scaled vehicle-barrier crash test is reported. The scaled crash test was performed as part of a wider project aimed at designing a new safety barrier making use of natural building materials. The scaled crash test was designed and performed as a proof of concept of the new mass-based safety barriers, and the study was composed of two parts: the scaling technique and a series of scaled crash tests. The scaling method was used for 1) setting the scaled-test impact velocity so that the energy dissipation and the momentum transfer from the car to the barrier could be reproduced, and 2) predicting the acceleration, velocity and displacement values occurring in the full-scale impact from the results obtained in a scaled test. To achieve this goal, the vehicle and barrier displacements had to be recorded together with the vehicle accelerations and angular velocities. These quantities were measured during the tests using acceleration sensors and a tracking system. The tracking system was composed of a high-speed camera and a set of targets used to measure the vehicle linear and angular velocities. A code was developed to extract the target velocities from the videos, and the velocities obtained were then compared with those obtained by integrating the accelerations provided by the sensors to check the reliability of the method.
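The consistency check described above can be illustrated with a short sketch: integrate the accelerometer signal to obtain velocity and compare it with the velocity derived from the tracked target positions. The signals below are synthetic stand-ins, not the test data, and the sampling rate, impact velocity and pulse shape are illustrative assumptions.

```python
# Hypothetical cross-check: velocity from integrating a measured acceleration versus
# velocity from differentiating camera-tracked positions (synthetic signals).
import numpy as np

def cumtrapz(y, dt):
    """Cumulative trapezoidal integral, starting at zero."""
    return np.concatenate(([0.0], np.cumsum(0.5 * (y[1:] + y[:-1]) * dt)))

fs, v0 = 1000.0, 8.0                          # sampling rate [Hz], impact velocity [m/s]
dt = 1.0 / fs
t = np.arange(0.0, 0.2, dt)                   # 200 ms impact window
true_acc = -150.0 * np.exp(-((t - 0.05) / 0.02) ** 2)   # idealized crash pulse [m/s^2]

acc_meas = true_acc + np.random.default_rng(0).normal(scale=2.0, size=t.size)
v_true = v0 + cumtrapz(true_acc, dt)
pos = cumtrapz(v_true, dt)                    # synthetic target positions "seen" by the camera

v_from_acc = v0 + cumtrapz(acc_meas, dt)      # sensor route: integrate measured acceleration
v_from_cam = np.gradient(pos, dt)             # camera route: differentiate tracked positions

rms_diff = np.sqrt(np.mean((v_from_acc - v_from_cam) ** 2))
print(f"RMS difference between the two velocity estimates: {rms_diff:.2f} m/s")
```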
Abstract:
Dependency on thermal generation and continued wind power growth in Europe, driven by renewable energy and greenhouse gas emissions targets, have resulted in an interesting set of challenges for power systems. The variability of wind power affects dispatch and balancing by grid operators, power plant operations by generating companies, and wholesale market costs. This paper quantifies the effects of high wind power penetration on power systems with a dependency on gas generation using a realistic unit commitment and economic dispatch model. The test system is analyzed under two scenarios, with and without wind, over one year. The key finding of this preliminary study is that, despite increased ramping requirements in the wind scenario, the unit cost of electricity due to sub-optimal operation of gas generators does not deviate substantially from the no-wind scenario.
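As a toy illustration of how wind shifts the operating point of a gas fleet, the sketch below dispatches three hypothetical gas units against the residual demand (demand minus wind) in a single period. The costs, limits and single-period linear-programming formulation are illustrative assumptions and are far simpler than the unit commitment and economic dispatch model used in the paper.

```python
# Toy single-period economic dispatch (illustrative assumptions only): wind is treated
# as must-take, so gas units serve the residual demand at minimum fuel cost.
from scipy.optimize import linprog

demand = 1000.0                               # system demand [MW]
wind = 250.0                                  # available wind [MW]; set to 0.0 for the no-wind case
cost = [45.0, 60.0, 80.0]                     # marginal cost of each gas unit [EUR/MWh]
p_min = [100.0, 50.0, 0.0]                    # minimum stable generation [MW]
p_max = [400.0, 350.0, 300.0]                 # maximum output [MW]

res = linprog(
    c=cost,
    A_eq=[[1.0, 1.0, 1.0]],                   # gas generation balances the residual demand
    b_eq=[demand - wind],
    bounds=list(zip(p_min, p_max)),
    method="highs",
)
print("gas dispatch [MW]:", [round(p, 1) for p in res.x])
print(f"dispatch cost: {res.fun:,.0f} EUR/h")
```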
Abstract:
Background: The COMET (Core Outcome Measures in Effectiveness Trials) Initiative is developing a publicly accessible online resource to collate the knowledge base for core outcome set (COS) development and the applied work from different health conditions. Ensuring that the database is as comprehensive as possible and keeping it up to date are key to its value for users. This requires the development and application of an optimal, multi-faceted search strategy to identify relevant material. This paper describes the challenges of designing and implementing such a search, outlining the development of the search strategy for studies of COS development and, in turn, the process for establishing a database of COS.
Methods: We investigated the performance characteristics of this strategy, including sensitivity, precision and number needed to read. We compared the contribution of each database towards identifying the included studies in order to determine the best combination of methods for retrieving all of them.
Results: Recall of the search strategies ranged from 4% to 87%, and precision from 0.77% to 1.13%. MEDLINE performed best in terms of recall, retrieving 216 (87%) of the 250 included records, followed by Scopus (44%). The Cochrane Methodology Register found just 4% of the included records. MEDLINE was also the database with the highest precision. The number needed to read varied between 89 (MEDLINE) and 130 (SCOPUS).
Conclusions: We found that two databases and hand searching were required to locate all of the studies in this review. MEDLINE alone retrieved 87% of the included studies, although 97% of the included studies were indexed in MEDLINE. The Cochrane Methodology Register did not contribute any records that were not found in the other databases, and will not be included in our future searches to identify studies developing COS. Scopus had the lowest precision (0.77%) and the highest number needed to read (130). In future COMET searches for COS, a balance needs to be struck between the work involved in screening large numbers of records, the frequency of the searching, and the likelihood that eligible studies will be identified by means other than the database searches.
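The figures quoted above are linked by simple relationships (recall = included records retrieved / total included; number needed to read = 1 / precision), as the short calculation below shows using the MEDLINE and Scopus numbers reported in the Results.

```python
# Quick check of the reported search metrics using the figures quoted above.
included_total = 250
medline_hits = 216                               # included records retrieved by MEDLINE

recall = medline_hits / included_total           # 216/250 = 0.864, quoted as 87%
nnr_medline, nnr_scopus = 89, 130                # number needed to read, as reported
precision_medline = 1 / nnr_medline              # ~0.0112, consistent with the 1.13% reported
precision_scopus = 1 / nnr_scopus                # ~0.0077, i.e. 0.77%

print(f"MEDLINE recall: {recall:.1%}")
print(f"MEDLINE precision: {precision_medline:.2%}, Scopus precision: {precision_scopus:.2%}")
```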
Abstract:
BACKGROUND: Despite vaccines and improved medical intensive care, clinicians must remain vigilant for possible Meningococcal Disease in children. The objective was to establish whether the procalcitonin test was a cost-effective adjunct for detecting prodromal Meningococcal Disease in children presenting at the emergency department with fever without source.
METHODS AND FINDINGS: Data to evaluate the procalcitonin, C-reactive protein and white cell count tests as indicators of Meningococcal Disease were collected from six independent studies identified through a systematic literature search applying PRISMA guidelines. The data included 881 children with fever without source in developed countries. The optimal cut-off value for each of the procalcitonin, C-reactive protein and white cell count tests, as an indicator of Meningococcal Disease, was determined. Summary Receiver Operating Characteristic curve analysis determined the overall diagnostic performance of each test with 95% confidence intervals. A decision analytic model was designed to reflect realistic clinical pathways for a child presenting with fever without source, comparing two diagnostic strategies: standard testing using combined C-reactive protein and white cell count tests, and standard testing plus the procalcitonin test. The costs of each of the four diagnosis groups (true positive, false negative, true negative and false positive) were assessed from a National Health Service payer perspective. The procalcitonin test was more accurate (sensitivity = 0.89, 95% CI = 0.76-0.96; specificity = 0.74, 95% CI = 0.4-0.92) for early Meningococcal Disease than standard testing alone (sensitivity = 0.47, 95% CI = 0.32-0.62; specificity = 0.8, 95% CI = 0.64-0.9). Decision analytic model outcomes indicated that the incremental cost-effectiveness ratio for the base case was -£8,137.25 (-US$13,371.94) per correctly treated patient.
CONCLUSIONS: Procalcitonin plus the standard recommended tests improved the discriminatory ability for fatal Meningococcal Disease and was more cost-effective; it was also a superior biomarker in infants. Further research is recommended on point-of-care procalcitonin testing and on Markov modelling to incorporate cost per QALY with a lifetime model.
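For clarity, the incremental cost-effectiveness ratio quoted above is the ratio of the cost difference to the effect difference between the two strategies. The sketch below uses purely hypothetical numbers (not the study's cost or outcome data) to show how a negative ICER arises when the procalcitonin strategy is both cheaper and more effective.

```python
# How an ICER is formed; all inputs here are hypothetical placeholders.
def icer(cost_new, cost_std, effect_new, effect_std):
    """ICER = (C_new - C_std) / (E_new - E_std), per correctly treated patient."""
    return (cost_new - cost_std) / (effect_new - effect_std)

# hypothetical example: the procalcitonin strategy costs less overall and treats
# more patients correctly, so the ICER is negative (the new strategy dominates)
value = icer(cost_new=180_000, cost_std=200_000, effect_new=44, effect_std=38)
print(f"ICER: GBP {value:,.2f} per additional correctly treated patient")
```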
Abstract:
Boolean games are a framework for reasoning about the rational behaviour of agents, whose goals are formalized using propositional formulas. They offer an attractive alternative to normal-form games, because they allow for a more intuitive and more compact encoding. Unfortunately, however, there is currently no general, tailor-made method available to compute the equilibria of Boolean games. In this paper, we introduce a method for finding the pure Nash equilibria based on disjunctive answer set programming. Our method is furthermore capable of finding the core elements and the Pareto optimal equilibria, and can easily be modified to support other forms of optimality, thanks to the declarative nature of disjunctive answer set programming. Experimental results clearly demonstrate the effectiveness of the proposed method.
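To make the solution concept concrete, the sketch below enumerates the pure Nash equilibria of a tiny, made-up Boolean game by brute force. This is only a naive illustration of the target concept, not the disjunctive answer set programming encoding proposed in the paper, and it does not scale beyond toy instances.

```python
# Brute-force pure Nash equilibria of a tiny Boolean game (illustrative only).
from itertools import product

# two agents, each controlling one propositional variable
control = {"ann": ["p"], "bob": ["q"]}
goals = {                                   # goals as predicates over a valuation
    "ann": lambda v: v["p"] == v["q"],      # Ann wants p <-> q
    "bob": lambda v: not v["q"] or v["p"],  # Bob wants q -> p
}
variables = [x for vs in control.values() for x in vs]

def deviations(valuation, agent):
    """All valuations the agent can reach by changing only its own variables."""
    own = control[agent]
    for bits in product([False, True], repeat=len(own)):
        yield {**valuation, **dict(zip(own, bits))}

def is_pure_nash(valuation):
    for agent, goal in goals.items():
        if not goal(valuation) and any(goal(d) for d in deviations(valuation, agent)):
            return False                    # a profitable unilateral deviation exists
    return True

for bits in product([False, True], repeat=len(variables)):
    valuation = dict(zip(variables, bits))
    if is_pure_nash(valuation):
        print("pure Nash equilibrium:", valuation)
```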
Abstract:
The research presented investigates the optimal set of operational codes (opcodes) that creates a robust indicator of malicious software (malware), and also determines the program execution duration required for accurate classification of benign and malicious software. The features extracted from the dataset are opcode density histograms obtained during program execution. The classifier used is a support vector machine, configured to select the features that produce the optimal classification of malware over different program run lengths. The findings demonstrate that malware can be detected using dynamic analysis with relatively few opcodes.
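A minimal sketch of the pipeline described above follows, using synthetic opcode traces and an assumed scikit-learn SVM: traces are converted into opcode density histograms over a fixed vocabulary and evaluated with a linear-kernel classifier. The opcode list, trace generator and labels are illustrative placeholders, not the paper's dataset, run-length analysis or feature-selection procedure.

```python
# Illustrative opcode-density + SVM pipeline with synthetic data (scikit-learn assumed).
import numpy as np
from collections import Counter
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

OPCODES = ["mov", "push", "pop", "call", "ret", "jmp", "xor", "add", "cmp", "nop"]
rng = np.random.default_rng(0)

def opcode_density(trace):
    """Relative frequency of each opcode observed during a (truncated) program run."""
    counts = Counter(trace)
    total = max(len(trace), 1)
    return np.array([counts[op] / total for op in OPCODES])

def make_trace(malicious, length=500):
    # synthetic traces: "malware-like" runs skew towards xor/jmp, benign towards mov/push
    probs = ([0.10] * 5 + [0.15, 0.20, 0.05, 0.05, 0.05]) if malicious else \
            ([0.25, 0.20, 0.15, 0.10, 0.05] + [0.05] * 5)
    return rng.choice(OPCODES, size=length, p=probs)

labels = [0] * 100 + [1] * 100
X = np.array([opcode_density(make_trace(m)) for m in labels])
y = np.array(labels)

scores = cross_val_score(SVC(kernel="linear"), X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```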