929 results for Source Code Analysis
Abstract:
The ney is an end-blown flute mainly used in Makam music. Although a score representation based on an extension of Western music notation has been in use since the beginning of the 20th century, actual ney music cannot be fully represented by a written score because of its rich articulation repertoire. The ney is still taught and transmitted orally in Turkey, so the performance itself plays a distinct and important role in ney music. Signal analysis of ney performances is therefore crucial for understanding the actual music. Another important aspect, also part of the performance, is the articulations that performers apply. In Turkish Makam music, articulations are neither taught nor even named by teachers. Articulations on the ney are valuable for understanding the real performance. Since articulations are not taught and their positions are not marked in the score, the choice and character of articulation is unique to each performer, which in turn makes each performance unique. Our method analyzes audio files of well-known Turkish ney players. To obtain our analysis data, we analyzed audio files of eight different performers, spanning the years 1920 to 2000.
Abstract:
Accurate determination of subpopulation sizes in bimodal populations remains problematic, yet it represents a powerful way to compare cellular heterogeneity under different environmental conditions. So far, most studies have relied on qualitative descriptions of population distribution patterns, on population-independent descriptors, or on arbitrary placement of thresholds distinguishing biological ON from OFF states. We found that all these methods fall short of accurately describing small subpopulation sizes in bimodal populations. Here we propose a simple, statistics-based method for the analysis of small subpopulation sizes for use in the free software environment R, and test this method on real as well as simulated data. Four so-called population splitting methods were designed with different algorithms that can estimate subpopulation sizes from bimodal populations. All four methods proved more precise than previously used methods when analyzing subpopulation sizes of transfer-competent cells arising in populations of the bacterium Pseudomonas knackmussii B13. The methods' resolving powers were further explored by bootstrapping and simulations. Two of the methods were not severely limited by the proportions of subpopulations they could estimate correctly, but the two others only allowed accurate subpopulation quantification when this amounted to less than 25% of the total population. In contrast, only one method was still sufficiently accurate with subpopulations smaller than 1% of the total population. This study proposes a number of rational approximations to quantifying small subpopulations and offers an easy-to-use protocol for their implementation in the open source statistical software environment R.
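The abstract does not reproduce the four population splitting algorithms themselves, but the general idea, estimating the weight of the smaller mode of a bimodal distribution from a fitted two-component mixture, can be sketched as follows (the data are simulated, and scikit-learn's GaussianMixture stands in for the R implementation described in the paper):

# Sketch: estimate subpopulation sizes in a bimodal population by fitting
# a two-component Gaussian mixture (simulated stand-in for the paper's
# R-based "population splitting" methods).
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Simulate a bimodal population: 5% "ON" cells against a 95% "OFF" background.
off = rng.normal(loc=2.0, scale=0.5, size=9500)   # OFF state (log fluorescence)
on = rng.normal(loc=5.0, scale=0.6, size=500)     # ON state
signal = np.concatenate([off, on]).reshape(-1, 1)

# Fit a two-component mixture; the mixing weights estimate subpopulation sizes.
gmm = GaussianMixture(n_components=2, n_init=5, random_state=0).fit(signal)
on_idx = np.argmax(gmm.means_)            # component with the higher mean = ON
print(f"estimated ON fraction: {gmm.weights_[on_idx]:.3%}")  # close to 5%

Unlike a fixed threshold between ON and OFF, the mixture weight remains meaningful when the two modes overlap, which is the regime where the abstract reports threshold-based methods failing.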
Abstract:
Deciding whether two fingerprint marks originate from the same source requires examination and comparison of their features. Many cognitive factors play a major role in such information processing. In this paper we examined the consistency (both between and within experts) in the analysis of latent marks, and whether the presence of a 'target' comparison print affects this analysis. Our findings showed that the context of a comparison print affected analysis of the latent mark, possibly influencing allocation of attention, visual search, and the threshold for determining a 'signal'. We also found that even without the context of the comparison print there was a lack of consistency in analysing latent marks. Not only was this reflected in inconsistency between different experts, but the same experts were inconsistent with their own analyses at different times. However, the characterization of these inconsistencies depends on the standard and definition of what constitutes an inconsistency. Furthermore, these effects were not uniform; the lack of consistency varied across fingerprints and experts. We propose solutions to mitigate variability in the analysis of friction ridge skin.
Abstract:
OBJECTIVE: HIV-1 post-exposure prophylaxis (PEP) is frequently prescribed after exposure to source persons with an undetermined HIV serostatus. To reduce unnecessary use of PEP, we implemented a policy including active contacting of source persons and the availability of free, anonymous HIV testing ('PEP policy'). METHODS: All consultations for potential non-occupational HIV exposures (i.e., outside the medical environment) were prospectively recorded. The impact of the PEP policy on PEP prescription and costs was analysed and modelled. RESULTS: Among 146 putative exposures, 47 involved a source person already known to be HIV positive and 23 had no indication for PEP. The remaining 76 exposures involved a source person of unknown HIV serostatus. Of the 33 (43.4%) exposures for which the source person could be contacted and tested, PEP was avoided in 24 (72.7%), initiated and discontinued in seven (21.2%), and prescribed and completed in two (6.1%). In contrast, of the 43 (56.6%) exposures for which the source person could not be tested, PEP was prescribed in 35 (81.4%), P < 0.001. Upon modelling, the PEP policy allowed a 31% reduction of the cost of managing exposures to source persons of unknown HIV serostatus. The policy was cost-saving for HIV prevalences of up to 70% in the source population. The availability of all source persons for testing would have reduced costs by 64%. CONCLUSION: In the management of non-occupational HIV exposures, active contacting and free, anonymous testing of source persons proved feasible. This policy resulted in a decrease in PEP prescription, proved to be cost-saving, and presumably helped to avoid unnecessary toxicity and psychological stress.
Abstract:
The present study is an analysis of IR sources in the Alpha Persei open cluster region, based on the IRAS Point Source Catalog and on ground-based photometric observations. Cross-identification between stars in the region and the IRAS Point Source Catalog was performed, and nine new associations were found. BVRI Johnson photometry for 24 of the matched objects has been carried out. The physical identity of the visual and IRAS sources and their relationship to the Alpha Persei open cluster are discussed.
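As an illustration of the cross-identification step, a naive positional cross-match between an optical star list and IRAS point sources can be sketched as follows (the coordinates and the one-arcminute match radius are invented for the example; the actual associations would be judged against the catalog's positional error ellipses):

# Sketch: naive positional cross-match between two source catalogs by
# angular separation (illustrative stand-in for cross-identification
# against the IRAS Point Source Catalog).
import math

def ang_sep_deg(ra1, dec1, ra2, dec2):
    """Angular separation in degrees (spherical law of cosines)."""
    ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
    cos_sep = (math.sin(dec1) * math.sin(dec2)
               + math.cos(dec1) * math.cos(dec2) * math.cos(ra1 - ra2))
    return math.degrees(math.acos(min(1.0, max(-1.0, cos_sep))))

# Hypothetical (RA, Dec) positions in degrees near Alpha Persei.
stars = {"star_A": (51.08, 49.86), "star_B": (52.31, 48.99)}
iras = {"IRAS_1": (51.081, 49.861), "IRAS_2": (55.00, 47.00)}

MATCH_RADIUS_DEG = 1.0 / 60.0  # 1 arcminute, chosen arbitrarily here
for sname, spos in stars.items():
    for iname, ipos in iras.items():
        if ang_sep_deg(*spos, *ipos) < MATCH_RADIUS_DEG:
            print(f"{sname} associated with {iname}")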
Abstract:
The COMPTEL unidentified source GRO J1411-64 was observed by INTEGRAL, and its central part also by XMM-Newton. The data analysis shows no hint of new detections at hard X-rays. The flux upper limits presented here constrain the energy spectrum of whatever was producing GRO J1411-64, imposing, in the framework of earlier COMPTEL observations, the existence of a peak in power output located somewhere between 300 and 700 keV for the so-called low state. The Circinus Galaxy is the only source detected within the 4$\sigma$ location error of GRO J1411-64, but it can be safely excluded as a possible counterpart: the extrapolation of its energy spectrum lies well below that of GRO J1411-64 at MeV energies. 22 significant sources (likelihood $> 10$) were extracted and analyzed from the XMM-Newton data. Only one of these sources, XMMU J141255.6-635932, is spectrally compatible with GRO J1411-64, although the fact that the soft X-ray observations do not cover the full extent of the COMPTEL source position uncertainty makes an association hard to quantify and thus risky. The single peak of the power output at high energies (hard X-rays and gamma-rays) resembles the spectral energy distributions seen in blazars or microquasars. However, an analysis using a microquasar model, consisting of a magnetized conical jet filled with relativistic electrons that radiate through synchrotron emission and inverse Compton scattering with star, disk, corona, and synchrotron photons, shows that it is hard to comply with all observational constraints. This, together with the non-detection at hard X-rays, places an a posteriori question mark over the physical reality of this source, which is discussed in some detail.
Abstract:
Leakage detection is an important issue in many chemical sensing applications. Leakage detection by thresholds suffers from important drawbacks when sensors have serious drifts or are affected by cross-sensitivities. Here we present an adaptive method based on a Dynamic Principal Component Analysis (DPCA) that models the relationships between the sensors in the array. In normal conditions, a certain variance distribution characterizes the sensor signals. However, in the presence of a new source of variance the PCA decomposition changes drastically. In order to prevent the influence of sensor drift, the model is adaptive and is calculated recursively with minimum computational effort. The behavior of this technique is studied with synthetic signals and with real signals arising from oil vapor leakages in an air compressor. Results clearly demonstrate the efficiency of the proposed method.
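A minimal sketch of the monitoring idea, building a PCA model of the normal sensor covariance and flagging a new variance source through the squared prediction error (the SPE or Q statistic), is given below; the recursive update is reduced here to simple exponential forgetting of the mean and covariance estimates, and all data are simulated:

# Sketch: adaptive PCA-based leak detection. A PCA model of "normal"
# sensor correlations is updated recursively (exponential forgetting);
# a new variance source shows up as a large residual (SPE / Q statistic).
import numpy as np

rng = np.random.default_rng(1)
n_sensors, forget = 6, 0.99

def spe(x, mean, components):
    """Squared prediction error of x against the PCA subspace."""
    xc = x - mean
    proj = components.T @ (components @ xc)
    return float(np.sum((xc - proj) ** 2))

# "Normal" regime: all sensors driven by one common factor plus noise.
loadings = rng.normal(size=n_sensors)
normal = rng.normal(size=(500, 1)) * loadings + 0.1 * rng.normal(size=(500, n_sensors))

mean = normal.mean(axis=0)
cov = np.cov(normal, rowvar=False)

for x in normal[-100:]:                        # recursive update on normal data
    mean = forget * mean + (1 - forget) * x
    cov = forget * cov + (1 - forget) * np.outer(x - mean, x - mean)

eigvals, eigvecs = np.linalg.eigh(cov)
components = eigvecs[:, -2:].T                 # retain 2 principal directions
threshold = 3 * np.median([spe(x, mean, components) for x in normal])

leak = normal[-1] + np.array([0, 0, 2.0, 2.0, 0, 0])  # leak hits two sensors
print("normal SPE:", spe(normal[-1], mean, components))
print("leak SPE  :", spe(leak, mean, components),
      "> threshold:", spe(leak, mean, components) > threshold)

Because the model is re-estimated with a forgetting factor, slow sensor drift is absorbed into the updated mean and covariance, while an abrupt new variance source still produces a residual well above the threshold, which is the distinction the method exploits.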
Abstract:
This article analyses and discusses issues that pertain to the choice of relevant databases for assigning values to the components of evaluative likelihood ratio procedures at source level. Although several formal likelihood ratio developments currently exist, both case practitioners and recipients of expert information (such as the judiciary) may be reluctant to consider them as a framework for evaluating scientific evidence in context. The recent ruling R v T and the ensuing discussions in many forums provide illustrative examples of this. In particular, it is often felt that likelihood ratio-based reasoning amounts to an application that requires extensive quantitative information, along with means for dealing with technicalities related to the algebraic formulation of these approaches. With regard to this objection, this article proposes two distinct discussions. In the first part, it is argued that, from a methodological point of view, there are additional levels of qualitative evaluation that are worth considering before focusing on particular numerical probability assignments. Analyses are proposed that intend to show that, under certain assumptions, relative numerical values, as opposed to absolute values, may be sufficient to characterize a likelihood ratio for practical and pragmatic purposes. The feasibility of such qualitative considerations shows that the availability of hard numerical data is not a necessary requirement for implementing a likelihood ratio approach in practice. It is further argued that, even if numerical evaluations can be made, qualitative considerations may be valuable because they can further the understanding of the logical underpinnings of an assessment. In the second part, the article draws a parallel to R v T by concentrating on a practical footwear mark case received at the authors' institute. This case serves to exemplify the possible usage of data from various sources in casework, and helps to discuss the difficulty associated with reconciling the depth of theoretical likelihood ratio developments with the limitations in the degree to which these developments can actually be applied in practice.
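For orientation, the likelihood ratio at source level discussed throughout has the standard general form (a textbook statement, not a formula quoted from the article):

\[
\mathrm{LR} = \frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)},
\]

where $E$ denotes the observed correspondence between the mark and the reference material, $H_p$ the proposition that the suspect's item is the source, and $H_d$ the proposition that some other item is the source. The article's argument is that ranking or bounding the two terms relative to one another can already support an evaluative statement, even when their absolute values are not available from hard data.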
Abstract:
Alzheimer's disease (AD) disrupts functional connectivity in distributed cortical networks. We analyzed changes in the S-estimator, a measure of multivariate intraregional synchronization, in electroencephalogram (EEG) source space in 15 mild AD patients versus 15 age-matched controls to evaluate its potential as a marker of AD progression. All participants underwent 2 clinical evaluations and 2 EEG recording sessions on diagnosis and after a year. The main effect of AD was hyposynchronization in the medial temporal and frontal regions and relative hypersynchronization in posterior cingulate, precuneus, cuneus, and parietotemporal cortices. However, the S-estimator did not change over time in either group. This result motivated an analysis of rapidly progressing AD versus slow-progressing patients. Rapidly progressing AD patients showed a significant reduction in synchronization with time, manifest in left frontotemporal cortex. Thus, the evolution of source EEG synchronization over time is correlated with the rate of disease progression and should be considered as a cost-effective AD biomarker.
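The S-estimator itself is not defined in the abstract; a commonly used formulation (due to Carmeli and colleagues) measures synchronization as one minus the normalized entropy of the eigenvalue spectrum of the channel correlation matrix. A sketch under that assumption:

# Sketch: S-estimator of multivariate synchronization (assuming the
# common eigenvalue-entropy definition; not code from the paper).
# S -> 1 when one eigenvalue dominates (full synchronization),
# S -> 0 when all eigenvalues are equal (no synchronization).
import numpy as np

def s_estimator(signals):
    """signals: array of shape (n_channels, n_samples)."""
    corr = np.corrcoef(signals)
    lam = np.linalg.eigvalsh(corr)
    lam = lam / lam.sum()                     # normalized eigenvalues
    lam = lam[lam > 1e-12]                    # avoid log(0)
    n = signals.shape[0]
    return 1.0 + float(np.sum(lam * np.log(lam))) / np.log(n)

rng = np.random.default_rng(2)
independent = rng.normal(size=(10, 2000))     # unsynchronized channels
common = rng.normal(size=2000)
synchronized = common + 0.1 * rng.normal(size=(10, 2000))

print(f"independent  S = {s_estimator(independent):.3f}")   # near 0
print(f"synchronized S = {s_estimator(synchronized):.3f}")  # near 1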
Abstract:
The transportation system is in demand 24 hours a day, 365 days a year, irrespective of weather or conditions. Iowa's transportation system is an integral and essential part of society, serving commerce and the daily functions of Iowans across the state. A high-quality transportation system serves as the artery for economic activity, and the condition of the infrastructure is a key element in future growth opportunities. A key component of Iowa's transportation system is the public roadway system owned and maintained by the state, cities, and counties. In order to regularly re-evaluate the condition of Iowa's public roadway infrastructure and assess the ability of existing revenues to meet the needs of the system, the Iowa Department of Transportation's 2006 Road Use Tax Fund (RUTF) report to the legislature included a recommendation that a study be conducted every five years. That recommendation was included in legislation adopted in 2007 and signed into law. The law specifically requires the following (2011 Iowa Code Section 307.31):
• "The department shall periodically review the current revenue levels of the road use tax fund and the sufficiency of those revenues for the projected construction and maintenance needs of city, county, and state governments in the future. The department shall submit a written report to the general assembly regarding its findings by December 31 every five years, beginning in 2011. The report may include recommendations concerning funding levels needed to support the future mobility and accessibility for users of Iowa's public road system."
• "The department shall evaluate alternative funding sources for road maintenance and construction and report to the general assembly at least every five years on the advantages and disadvantages and the viability of alternative funding mechanisms."
Consistent with this requirement, the Iowa Department of Transportation (DOT) has prepared this study. Recognizing the importance of actively engaging the public and transportation stakeholders in any discussion of public roadway conditions and needs, Governor Terry E. Branstad announced on March 8, 2011, the creation of, and appointments to, the Governor's Transportation 2020 Citizen Advisory Commission (CAC). The CAC was tasked with assisting the Iowa DOT in assessing the condition of Iowa's roadway system and evaluating current and future funding available to best address system needs. In particular, the CAC was directed to gather input from the public and stakeholders regarding the condition of Iowa's public roadway system, the impact of that system, whether additional funding is needed to maintain and improve the system, and, if so, what funding mechanisms ought to be considered. With this input, the CAC prepared a report and recommendations that were presented to Governor Branstad and the Iowa DOT in November 2011 for use in the development of this study. The CAC's report is available at www.iowadot.gov/transportation2020/pdfs/CAC%20REPORT%20FINAL%20110211.pdf. The CAC's report was developed using analysis and information from the Iowa DOT; it therefore forms the basis for this study, and the two documents are very similar. Iowa is fortunate to have an extensive public roadway system that provides access to all areas of the state and facilitates the efficient movement of goods and people.
However, it is also a tremendous challenge for the state, cities, and counties to maintain and improve this system given flattening revenue, lost buying power, changing demands on the system, severe weather, and an aging infrastructure. This challenge did not appear overnight: over the last decade many studies have examined the situation, and the legislature has taken significant action to begin addressing it. In addition, the Iowa DOT and Iowa's cities and counties have worked jointly and independently to increase efficiency and streamline operations. All of these actions have been successful and have resulted in significant changes; however, it is apparent that much more needs to be done. A well-maintained, high-quality transportation system reduces transportation costs and provides consistent and reliable service, factors that are critical in the evaluations companies undertake when deciding where to expand or locate new developments. The CAC and the Iowa DOT heard from many Iowans that additional investment in Iowa's roadway system is vital to support existing jobs and continued job creation in the state of Iowa. Beginning in June 2011, the CAC met regularly to review material and discuss potential recommendations to address Iowa's roadway funding challenges. This effort included extensive public outreach, with meetings held in seven locations across Iowa and through a Transportation 2020 website hosted by the Iowa DOT (www.iowadot.gov/transportation2020). Over 500 people attended the public meetings held through August and September, with 198 providing verbal or written comment at the meetings or through the website. Comments were received from a wide array of individuals and demonstrated overwhelming support for increased funding for Iowa's roads. Through the public input process, several guiding principles were established to guide the development of recommendations:
• Additional revenues are restricted to road and bridge improvements only, as 95 percent of current state road revenue (including the fuel tax and registration fees) already is.
• State and local governments continue to streamline and become more efficient, both individually and by looking for ways to do things collectively.
• The user fee concept is preserved: those who use the roads pay for them, including non-residents.
• Revenue-generating methods are equitable across users.
• Increase revenue through mechanisms that are viable now, while beginning to implement and set the stage for longer-term solutions that bring equity and stability to road funding.
• Continue Iowa's long-standing tradition of pay-as-you-go state roadway financing. Iowa must not fall into the situation other states currently face, where the majority of new program dollars are used to pay the debt service on past bonding.
Based on the analysis of Iowa's public roadway needs and revenue and the extensive work of the Governor's Transportation 2020 Citizen Advisory Commission, the Iowa DOT has identified specific recommendations. The recommendations follow very closely those of the CAC (the CAC recommendations from their report are repeated in Appendix B). Following is a summary of the recommendations, which are fully documented beginning on page 21.
1. Through a combination of efficiency savings and increased revenue, a minimum of $215 million of revenue per year should be generated to meet Iowa's critical roadway needs.
2. The Code of Iowa should be changed to require the study of the sufficiency of the state's road funds to meet the road system's needs every two years instead of every five years, to coincide with the biennial legislative budget appropriation schedule.
3. Modify the current registration fee for electric vehicles to be based on weight and value, using the same formula that applies to most passenger vehicles.
4. Consistent with existing Code of Iowa requirements, new funding should go to the TIME-21 Fund up to the cap ($225 million), with remaining new funding distributed consistent with the Road Use Tax Fund distribution formula.
5. The CAC recommended that the Iowa DOT convene meetings with cities and counties at least annually to review the operation, maintenance, and improvement of Iowa's public roadway system and identify ways to jointly increase efficiency. In direct response to this recommendation, Governor Branstad directed the Iowa DOT to begin this effort immediately, with a target of identifying $50 million of efficiency savings that can be captured from the over $1 billion of state revenue already provided to the Iowa DOT and Iowa's cities and counties to administer, maintain, and improve Iowa's public roadway system. This would build upon past joint and individual actions that have reduced administrative costs and increased funding for improvement of Iowa's public roadway system. Efficiency actions should be quantified, measured, and reported to the public on a regular basis.
6. By June 30, 2012, the Iowa DOT should complete a study of vehicles and equipment that use Iowa's public roadway system but pay no user fees, or substantially lower user fees than other vehicles and equipment.
Abstract:
Soil infiltration is a key link in the natural water cycle. Studies of soil permeability are conducive to water resources assessment and estimation, runoff regulation and management, soil erosion modeling, and nonpoint and point source pollution of farmland, among other aspects. The unequal influence of rainfall duration, rainfall intensity, antecedent soil moisture, vegetation cover, vegetation type, and slope gradient on cumulative soil infiltration was studied under simulated rainfall and different underlying surfaces. We established a six-factor model of cumulative soil infiltration using an improved back propagation (BP) artificial neural network algorithm with a momentum term and a self-adjusting learning rate. Compared to the multiple nonlinear regression method, the stability and accuracy of the improved BP algorithm were better. Based on the improved BP model, the sensitivity index of these six factors on cumulative soil infiltration was investigated. Second, the grey relational analysis method was used to individually study grey correlations among these six factors and cumulative soil infiltration. The results of the two methods were very similar. Rainfall duration was the most influential factor, followed by vegetation cover, vegetation type, rainfall intensity, and antecedent soil moisture. The effect of slope gradient on cumulative soil infiltration was not significant.
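The abstract names the two components of the improved BP algorithm, a momentum term and a self-adjusting learning rate, without specifying them; the following minimal single-hidden-layer sketch shows one common way to realize both (the network size, the update heuristics, and the simulated six-factor data are illustrative assumptions, not the paper's model):

# Sketch: back-propagation with a momentum term and a self-adjusting
# learning rate (the two improvements named in the abstract; network
# size, data, and update rules here are illustrative assumptions).
import numpy as np

rng = np.random.default_rng(3)
X = rng.uniform(size=(200, 6))                  # six stand-in input factors
y = (X @ rng.uniform(size=6) + 0.1 * rng.normal(size=200)).reshape(-1, 1)

n_hidden, lr, momentum = 8, 0.1, 0.9
W1 = rng.normal(scale=0.5, size=(6, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, 1)); b2 = np.zeros(1)
vel = [np.zeros_like(p) for p in (W1, b1, W2, b2)]
prev_loss = np.inf

for epoch in range(2000):
    h = np.tanh(X @ W1 + b1)                    # forward pass
    out = h @ W2 + b2
    err = out - y
    loss = float(np.mean(err ** 2))

    # Self-adjusting learning rate: grow while the loss falls, shrink otherwise.
    lr = lr * 1.05 if loss < prev_loss else lr * 0.7
    prev_loss = loss

    # Backward pass (gradients of the mean squared error).
    g_out = 2 * err / len(X)
    d_pre1 = (g_out @ W2.T) * (1 - h ** 2)
    grads = [X.T @ d_pre1, d_pre1.sum(axis=0), h.T @ g_out, g_out.sum(axis=0)]

    # Momentum update: velocity accumulates a fraction of past steps.
    for p, v, g in zip((W1, b1, W2, b2), vel, grads):
        v *= momentum
        v -= lr * g
        p += v

print(f"final MSE: {prev_loss:.4f}")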
Abstract:
During my PhD, my aim was to provide new tools to increase our capacity to analyse gene expression patterns, and to study the evolution of gene expression in animals on a large scale. Gene expression patterns (when and where a gene is expressed) are a key feature in understanding gene function, notably in development. It now appears clear that the evolution of developmental processes and of phenotypes is shaped both by evolution at the coding sequence level and at the gene expression level. Studying gene expression evolution in animals, with complex expression patterns over tissues and developmental time, is still challenging: no tools are available to routinely compare expression patterns between different species, with precision, and on a large scale. Studies of gene expression evolution have therefore been performed only on small gene datasets, or using imprecise descriptions of expression patterns. The aim of my PhD was thus to develop and use novel bioinformatics resources to study the evolution of gene expression. To this end, I developed the database Bgee (Base for Gene Expression Evolution). The approach of Bgee is to transform heterogeneous expression data (ESTs, microarrays, and in-situ hybridizations) into present/absent calls, and to annotate them to standard representations of the anatomy and development of different species (anatomical ontologies). An extensive mapping between the anatomies of species is then developed based on hypotheses of homology. These precise annotations to anatomies, and this extensive mapping between species, are the major assets of Bgee, and have required the involvement of many co-workers over the years. My main personal contribution is the development and management of both the Bgee database and the web application. Bgee is now in its ninth release, and includes an important gene expression dataset for 5 species (human, mouse, drosophila, zebrafish, Xenopus), with the most data from mouse, human, and zebrafish. Using these three species, I conducted an analysis of gene expression evolution after duplication in vertebrates. Gene duplication is thought to be a major source of novelty in evolution, and to participate in speciation. It has been suggested that the evolution of gene expression patterns might contribute to the retention of duplicate genes. I performed a large-scale comparison of the expression patterns of hundreds of duplicated genes to their singleton ortholog in an outgroup, including both small- and large-scale duplicates, in three vertebrate species (human, mouse, and zebrafish), using highly accurate descriptions of expression patterns. My results showed unexpectedly high rates of de novo acquisition of expression domains after duplication (neofunctionalization), at least as high as or higher than rates of partitioning of expression domains (subfunctionalization). I found differences in the evolution of expression of small- and large-scale duplicates, with small-scale duplicates more prone to neofunctionalization. Duplicates with neofunctionalization seemed to evolve under more relaxed selective pressure on the coding sequence. Finally, even with abundant and precise expression data, the majority fate I recovered was neither neo- nor subfunctionalization of expression domains, suggesting a major role for other mechanisms in duplicate gene retention.
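In its simplest form, the classification of duplicate fates described here reduces to set comparisons between the expression domains (sets of anatomical terms with present calls) of the two duplicates and the singleton ortholog; a toy sketch of that logic follows (the anatomical terms and the decision rules are simplified assumptions for illustration, not Bgee's actual pipeline):

# Sketch: classifying duplicate-gene fates by comparing expression domains
# (sets of anatomical terms with present calls) of two duplicates against
# their singleton ortholog. Simplified toy logic, not Bgee's pipeline.
def classify_fate(ancestor: set, dup1: set, dup2: set) -> str:
    union = dup1 | dup2
    if union - ancestor:
        return "neofunctionalization"      # new expression domain acquired
    if union == ancestor and dup1 != ancestor and dup2 != ancestor:
        return "subfunctionalization"      # ancestral domains partitioned
    return "other (e.g., redundancy or loss)"

# Hypothetical expression domains (anatomical ontology terms).
ancestor = {"brain", "heart", "liver"}
print(classify_fate(ancestor, {"brain", "heart"}, {"liver"}))              # sub
print(classify_fate(ancestor, {"brain", "heart", "liver", "skin"},
                    {"brain"}))                                            # neo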
Abstract:
This paper analyses and discusses arguments that emerge from a recent discussion about the proper assessment of the evidential value of correspondences observed between the characteristics of a crime stain and those of a sample from a suspect when (i) this latter individual is found as a result of a database search and (ii) remaining database members are excluded as potential sources (because of different analytical characteristics). Using a graphical probability approach (i.e., Bayesian networks), the paper here intends to clarify that there is no need to (i) introduce a correction factor equal to the size of the searched database (i.e., to reduce a likelihood ratio), nor to (ii) adopt a propositional level not directly related to the suspect matching the crime stain (i.e., a proposition of the kind 'some person in (outside) the database is the source of the crime stain' rather than 'the suspect (some other person) is the source of the crime stain'). The present research thus confirms existing literature on the topic that has repeatedly demonstrated that the latter two requirements (i) and (ii) should not be a cause of concern.
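To make the disputed quantity explicit: writing $\gamma$ for the probability of the corresponding analytical characteristics in an unknown person, the source-level likelihood ratio for 'the suspect is the source of the crime stain' versus 'some other person is the source' retains its usual form after a search of a database of size $n$, rather than being reduced by a factor equal to the database size (a schematic restatement of the abstract's conclusion, not a derivation from the paper):

\[
\mathrm{LR} = \frac{\Pr(E \mid H_p)}{\Pr(E \mid H_d)} = \frac{1}{\gamma}
\quad\text{rather than}\quad \frac{1}{n\,\gamma}.
\]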
Abstract:
This paper describes methods to analyze the brain's electric fields recorded with multichannel electroencephalogram (EEG) and demonstrates their implementation in the software CARTOOL. It focuses on the analysis of the spatial properties of these fields and on quantitative assessment of changes in field topographies across time, experimental conditions, or populations. Topographic analyses are advantageous because they are reference independent and thus render statistically unambiguous results. Neurophysiologically, differences in topography directly indicate changes in the configuration of the active neuronal sources in the brain. We describe global measures of field strength and field similarity, temporal segmentation based on topographic variations, topographic analysis in the frequency domain, topographic statistical analysis, and source imaging based on distributed inverse solutions. All analysis methods are implemented in a freely available academic software package called CARTOOL. Besides providing these analysis tools, CARTOOL is particularly designed to visualize the data and the analysis results using 3-dimensional display routines that allow rapid manipulation and animation of 3D images. CARTOOL is therefore a helpful tool for researchers as well as clinicians to interpret multichannel EEG and evoked potentials in a global, comprehensive, and unambiguous way.
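Of the global measures mentioned, global field power (GFP) and global map dissimilarity (GMD) have compact standard definitions, sketched below for a single time point (these are standard formulations from the topographic-analysis literature, not code extracted from CARTOOL):

# Sketch: two standard global measures of the EEG field at one time point:
# global field power (GFP), the spatial standard deviation across electrodes,
# and global map dissimilarity (GMD) between two GFP-normalized maps.
import numpy as np

def gfp(v):
    """Global field power of one scalp map v (n_electrodes,)."""
    return float(np.sqrt(np.mean((v - v.mean()) ** 2)))

def gmd(v1, v2):
    """Global map dissimilarity: GFP of the difference of normalized maps.
    0 = identical topographies, 2 = polarity-inverted topographies."""
    u1 = (v1 - v1.mean()) / gfp(v1)
    u2 = (v2 - v2.mean()) / gfp(v2)
    return float(np.sqrt(np.mean((u1 - u2) ** 2)))

rng = np.random.default_rng(4)
map_a = rng.normal(size=64)            # hypothetical 64-electrode map
map_b = 3.0 * map_a                    # same topography, stronger field
print(f"GFP a = {gfp(map_a):.3f}, GFP b = {gfp(map_b):.3f}")
print(f"GMD(a, b) = {gmd(map_a, map_b):.3f}")   # ~0: same configuration

The example illustrates why topographic measures are reference independent in practice: scaling a map changes its GFP but leaves the GMD at zero, so only genuine changes in the spatial configuration of the sources register as dissimilarity.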
Abstract:
In October 1998, Hurricane Mitch triggered numerous landslides (mainly debris flows) in Honduras and Nicaragua, resulting in a high death toll and in considerable damage to property. The potential application of relatively simple and affordable spatial prediction models for landslide hazard mapping in developing countries was studied. Our attention was focused on a region in NW Nicaragua, one of the most severely hit places during the Mitch event. A landslide map was produced at 1:10 000 scale in a Geographic Information System (GIS) environment from the interpretation of aerial photographs and detailed field work. In this map the terrain failure zones were distinguished from the areas within the reach of the mobilized materials. A Digital Elevation Model (DEM) with a pixel size of 20 m x 20 m was also employed in the study area. A comparative analysis was carried out between the terrain failures caused by Hurricane Mitch and a selection of four terrain factors, extracted from the DEM, which contributed to terrain instability. Land propensity to failure was determined with the aid of a bivariate analysis and GIS tools, yielding a terrain failure susceptibility map. In order to estimate the areas that could be affected by the path or deposition of the mobilized materials, we considered the fact that under intense rainfall events debris flows tend to travel long distances following the maximum slope and merging with the drainage network. Using the TauDEM extension for ArcGIS software we automatically generated flow lines following the maximum slope in the DEM, starting from the areas prone to failure in the terrain failure susceptibility map. The areas crossed by the flow lines from each terrain failure susceptibility class correspond to the runout susceptibility classes represented in a runout susceptibility map. The study of terrain failure and runout susceptibility enabled us to obtain a spatial prediction for landslides, which could contribute to landslide risk mitigation.
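The runout step, tracing flow lines downslope from source cells on the DEM, follows the classic steepest-descent (D8) idea; a minimal sketch on a toy elevation grid is given below (the study used TauDEM within ArcGIS; this stand-alone fragment only illustrates the principle):

# Sketch: trace a flow line downslope from a source cell on a DEM grid by
# repeatedly stepping to the steepest-descent (D8) neighbor. Toy stand-in
# for the TauDEM flow-line generation described in the study.
import numpy as np

def flow_path(dem, start, max_steps=1000):
    """Follow the steepest descent from `start` until a pit or the edge."""
    rows, cols = dem.shape
    r, c = start
    path = [(r, c)]
    for _ in range(max_steps):
        best, best_drop = None, 0.0
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr == dc == 0:
                    continue
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    dist = np.hypot(dr, dc)        # diagonal steps are longer
                    drop = (dem[r, c] - dem[nr, nc]) / dist
                    if drop > best_drop:
                        best, best_drop = (nr, nc), drop
        if best is None:                           # pit: no lower neighbor
            break
        r, c = best
        path.append((r, c))
    return path

# Toy DEM: a tilted plane with some noise (elevations in meters).
rng = np.random.default_rng(5)
y, x = np.mgrid[0:20, 0:20]
dem = 100.0 - 2.0 * y - 1.0 * x + rng.normal(scale=0.2, size=(20, 20))

print(flow_path(dem, start=(0, 0))[:10])   # heads toward the low corner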