919 results for GRAPHICAL LASSO
Abstract:
The measurement of pavement roughness has been the concern of highway engineers for more than 70 years. This roughness is referred to as "riding quality" by the traveling public. Pavement roughness evaluating devices have attempted to place either a graphical or numerical value on the public's riding comfort or discomfort. Early graphical roughness recorders had many different designs. In 1900 an instrument called the "Viagraph" was developed by an Irish engineer. The "Viagraph" consisted of a twelve-foot board with a graphical recorder drawn over the pavement. The "Profilometer" built in Illinois in 1922 was much more impressive. The instrument's recorder was mounted on a frame supported by 32 bicycle wheels mounted in tandem. Many other variations of profilometers with recorders were built, but most were difficult to handle and could not secure uniformly reproducible results. The Bureau of Public Roads (BPR) Road Roughness Indicator, built in 1941, is the most widely used numerical roughness recorder. The BPR Road Roughness Indicator consists of a trailer unit with carefully selected springs, a means of dampening, and a balanced wheel.
Abstract:
A recurring task in the analysis of mass genome annotation data from high-throughput technologies is the identification of peaks or clusters in a noisy signal profile. Examples of such applications are the definition of promoters on the basis of transcription start site profiles, the mapping of transcription factor binding sites based on ChIP-chip data and the identification of quantitative trait loci (QTL) from whole genome SNP profiles. Input to such an analysis is a set of genome coordinates associated with counts or intensities. The output consists of a discrete number of peaks with respective volumes, extensions and center positions. We have developed for this purpose a flexible one-dimensional clustering tool, called MADAP, which we make available as a web server and as a standalone program. A set of parameters enables the user to customize the procedure to a specific problem. The web server, which returns results in textual and graphical form, is useful for small to medium-scale applications, as well as for evaluation and parameter tuning in view of large-scale applications, which require a local installation. The program written in C++ can be freely downloaded from ftp://ftp.epd.unil.ch/pub/software/unix/madap. The MADAP web server can be accessed at http://www.isrec.isb-sib.ch/madap/.
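The kind of one-dimensional clustering described above (genome coordinates with counts in; peak centers, extensions and volumes out) can be sketched as a minimal gap-based grouping. This is not the MADAP algorithm itself; the grouping rule and the `max_gap` parameter are assumptions for illustration only.

```python
def cluster_positions(coords, counts, max_gap=50):
    """Group genome coordinates into peaks: a position joins the current
    cluster when it lies within max_gap bp of the cluster's end."""
    order = sorted(range(len(coords)), key=lambda i: coords[i])
    clusters = []
    for i in order:
        if clusters and coords[i] - clusters[-1]["end"] <= max_gap:
            c = clusters[-1]
            c["end"] = coords[i]
            c["volume"] += counts[i]
            c["weighted"] += coords[i] * counts[i]
        else:
            clusters.append({"start": coords[i], "end": coords[i],
                             "volume": counts[i],
                             "weighted": coords[i] * counts[i]})
    for c in clusters:
        c["center"] = c["weighted"] / c["volume"]  # count-weighted center
        del c["weighted"]
    return clusters

# Two well-separated groups of transcription start sites
peaks = cluster_positions([100, 120, 130, 900, 910], [5, 10, 5, 3, 3])
```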
Abstract:
The dispersion of the samples in soil particle-size analysis is a fundamental step, which is commonly achieved with a combination of chemical agents and mechanical agitation. The purpose of this study was to evaluate the efficiency of a low-speed reciprocal shaker for the mechanical dispersion of soil samples of different textural classes. The particle size of 61 soil samples was analyzed in four replications, using the pipette method to determine the clay fraction and sieving to determine coarse, fine and total sand fractions. The silt content was obtained by difference. To evaluate the performance, the results of the reciprocal shaker (RSh) were compared with data of the same soil samples available in reports of the Proficiency testing for Soil Analysis Laboratories of the Agronomic Institute of Campinas (Prolab/IAC). The accuracy was analyzed based on the maximum and minimum values defining the confidence intervals for the particle-size fractions of each soil sample. Graphical indicators were also used for data comparison, based on dispersion and linear adjustment. The descriptive statistics indicated predominantly low variability in more than 90 % of the results for sand, medium-textured and clay samples, and for 68 % of the results for heavy clay samples, indicating satisfactory repeatability of measurements with the RSh. Medium variability was frequently associated with silt, followed by the fine sand fraction. The sensitivity analyses indicated an accuracy of 100 % for the three main separates (total sand, silt and clay), in all 52 samples of the textural classes heavy clay, clay and medium. For the nine sand soil samples, the average accuracy was 85.2 %; highest deviations were observed for the silt fraction. 
In relation to the linear adjustments, the correlation coefficients of 0.93 (silt) or > 0.93 (total sand and clay), as well as the differences between the angular coefficients and unity (< 0.16), indicated a high correlation between the reference data (Prolab/IAC) and results obtained with the RSh. In conclusion, the mechanical dispersion by the reciprocal shaker of soil samples of different textural classes was satisfactory. The results support recommending the equipment at low agitation for particle-size analysis. The advantages of this Brazilian apparatus are its low cost and the possibility of simultaneously analyzing a great number of samples using ordinary, easily replaceable glass or plastic bottles.
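The silt-by-difference bookkeeping and the accuracy criterion used above (a result counts as accurate when it falls inside the reference confidence interval for that sample and fraction) can be sketched as follows; the sample values and interval bounds are invented for illustration, not Prolab/IAC data.

```python
def silt_by_difference(total_sand, clay):
    """Silt content (g/kg) obtained by difference from the total
    sand and clay fractions of a 1000 g/kg whole."""
    return 1000 - total_sand - clay

def within_interval(value, lower, upper):
    """Accuracy criterion: the result falls inside the reference
    confidence interval for that sample and fraction."""
    return lower <= value <= upper

silt = silt_by_difference(total_sand=520, clay=310)  # hypothetical sample
```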
Abstract:
The generation and storage of construction and demolition waste (CDW) constitute an environmental problem, since they account for more than 50 % of the total solid waste generated in medium-sized and large urban centers. However, since limestone is one of the main raw materials used in the manufacture of cement and hydrated lime and, consequently, of concrete, mortar, and plaster, recycling these materials can in principle provide a by-product with potential for correcting soil acidity. The objective of this study was to evaluate the use of recycled construction and demolition waste as a corrective of soil acidity. Recycled CDW (RCDW) from concrete, mortar, and plaster (gray material) was used and initially characterized by X-ray fluorescence and X-ray diffraction. The performance of the gray RCDW as an acidity corrective was evaluated by the dry-matter production of alfalfa (Medicago sativa cv. Crioula) and by measuring soil chemical properties. The results suggest that gray RCDW (from concrete) has promising characteristics for use as a corrective of soil acidity, but at rates above 24 t ha-1 when applied over the total area.
Abstract:
This paper applies probability and decision theory in the graphical interface of an influence diagram to study the formal requirements of rationality which justify the individualization of a person found through a database search. The decision-theoretic part of the analysis studies the parameters that a rational decision maker would use to individualize the selected person. The modeling part (in the form of an influence diagram) clarifies the relationships between this decision and the ingredients that make up the database search problem, i.e., the results of the database search and the different pairs of propositions describing whether an individual is at the source of the crime stain. These analyses evaluate the desirability associated with the decision of 'individualizing' (and 'not individualizing'). They point out that this decision is a function of (i) the probability that the individual in question is, in fact, at the source of the crime stain (i.e., the state of nature), and (ii) the decision maker's preferences among the possible consequences of the decision (i.e., the decision maker's loss function). We discuss the relevance and argumentative implications of these insights with respect to recent comments in specialized literature, which suggest points of view that are opposed to the results of our study.
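The decision-theoretic core described above, a comparison of the expected losses of 'individualizing' and 'not individualizing' as a function of the probability that the selected person is the source and of the decision maker's loss function, can be sketched as follows. The zero-loss convention for correct decisions and the particular loss values are illustrative assumptions, not the paper's model.

```python
def decide(p_source, loss_false_ind, loss_missed_ind):
    """Choose the action with the smaller expected loss.
    p_source: probability the selected person is the source (state of nature).
    Correct decisions are assigned zero loss (a common simplification)."""
    el_individualize = (1 - p_source) * loss_false_ind   # wrong if not source
    el_refrain = p_source * loss_missed_ind              # wrong if source
    return "individualize" if el_individualize < el_refrain else "do not individualize"

# With a false individualization judged 100x worse than a missed one,
# individualization requires p_source above 100/101, i.e. about 0.990.
```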
Abstract:
The graphical representation of spatial soil properties in a digital environment is complex because it requires a conversion of data collected in a discrete form onto a continuous surface. The objective of this study was to apply three-dimensional techniques of interpolation and visualization on soil texture and fertility properties and establish relationships with pedogenetic factors and processes in a slope area. The GRASS Geographic Information System was used to generate three-dimensional models and ParaView software to visualize soil volumes. Samples of the A, AB, BA, and B horizons were collected in a regular 122-point grid in an area of 13 ha, in Pinhais, PR, in southern Brazil. Geoprocessing and graphic computing techniques were effective in identifying and delimiting soil volumes of distinct ranges of fertility properties confined within the soil matrix. Both three-dimensional interpolation and the visualization tool facilitated interpretation in a continuous space (volumes) of the cause-effect relationships between soil texture and fertility properties and pedological factors and processes, such as higher clay contents following the drainage lines of the area. The flattest part with more weathered soils (Oxisols) had the highest pH values and lower Al3+ concentrations. These techniques of data interpolation and visualization have great potential for use in diverse areas of soil science, such as identification of soil volumes occurring side-by-side but that exhibit different physical, chemical, and mineralogical conditions for plant root growth, and monitoring of plumes of organic and inorganic pollutants in soils and sediments, among other applications. The methodological details for interpolation and a three-dimensional view of soil data are presented here.
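The abstract does not specify which interpolation method was used in GRASS; as a generic illustration of estimating a soil property at an unsampled three-dimensional point (x, y, depth), here is a minimal inverse-distance-weighting sketch. The function name, sample layout, and power parameter are assumptions, not the study's method.

```python
def idw3d(samples, query, power=2):
    """Inverse-distance-weighted estimate at a 3-D point (x, y, depth).
    samples: list of ((x, y, z), value) tuples."""
    num = den = 0.0
    for (x, y, z), v in samples:
        d2 = (x - query[0])**2 + (y - query[1])**2 + (z - query[2])**2
        if d2 == 0:
            return v  # query coincides with a sample point
        w = 1.0 / d2 ** (power / 2)
        num += w * v
        den += w
    return num / den

# Clay content (g/kg) at two sampled points, estimated midway between them
clay = idw3d([((0, 0, 0), 300.0), ((10, 0, 0), 500.0)], (5, 0, 0))
```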
Abstract:
DnaSP is a software package for the analysis of DNA polymorphism data. The present version introduces several new modules and features which, among other options, allow: (1) handling big data sets (~5 Mb per sequence); (2) conducting a large number of coalescent-based tests by Monte Carlo computer simulations; (3) extensive analyses of the genetic differentiation and gene flow among populations; (4) analysing the evolutionary pattern of preferred and unpreferred codons; (5) generating graphical outputs for an easy visualization of results. Availability: The software package, including complete documentation and examples, is freely available to academic users from: http://www.ub.es/dnasp
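The coalescent-based Monte Carlo tests of point (2) rest on simulating genealogies. A minimal sketch of the standard neutral coalescent, in which the waiting time to the next coalescence among k lineages is exponential with rate k(k-1)/2 (time in units of 2N generations), is shown below; it is unrelated to DnaSP's actual implementation.

```python
import random

def coalescent_tmrca(n, rng=random):
    """Time to the most recent common ancestor of n lineages under the
    neutral coalescent: while k lineages remain, the next coalescence
    arrives after an Exp(k*(k-1)/2) waiting time."""
    t = 0.0
    for k in range(n, 1, -1):
        t += rng.expovariate(k * (k - 1) / 2)
    return t

random.seed(0)
times = [coalescent_tmrca(10) for _ in range(20000)]
mean_tmrca = sum(times) / len(times)  # theory predicts 2*(1 - 1/10) = 1.8
```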
Abstract:
BACKGROUND: DNA sequence polymorphism analysis can provide valuable information on the evolutionary forces shaping nucleotide variation and insight into the functional significance of genomic regions. The ongoing genome projects will radically improve our capabilities to detect specific genomic regions shaped by natural selection. Currently available methods and software, however, are unsatisfactory for such genome-wide analysis. RESULTS: We have developed methods for the analysis of DNA sequence polymorphisms at the genome-wide scale. These methods, which have been tested on coalescent-simulated and actual data files from mouse and human, have been implemented in the VariScan software package version 2.0. Additionally, we have also incorporated a graphical user interface. The main features of this software are: i) exhaustive population-genetic analyses including those based on coalescent theory; ii) analysis adapted to the shallow data generated by the high-throughput genome projects; iii) use of genome annotations to conduct comprehensive analyses separately for different functional regions; iv) identification of relevant genomic regions by the sliding-window and wavelet-multiresolution approaches; v) visualization of the results integrated with current genome annotations in commonly available genome browsers. CONCLUSION: VariScan is a powerful and flexible suite of software for the analysis of DNA polymorphisms. The current version implements new algorithms, methods, and capabilities, providing an important tool for an exhaustive exploratory analysis of genome-wide DNA polymorphism data.
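The sliding-window approach of point iv) can be illustrated with nucleotide diversity (pi, the mean number of pairwise differences per site among aligned sequences) computed in non-overlapping windows. This is a toy sketch under simple assumptions (gap-free alignment, window and step sizes chosen arbitrarily), not VariScan's code.

```python
from itertools import combinations

def pi_in_window(seqs, start, end):
    """Nucleotide diversity: mean pairwise differences per site
    among aligned sequences, restricted to positions [start, end)."""
    pairs = list(combinations(seqs, 2))
    diffs = sum(sum(a != b for a, b in zip(s1[start:end], s2[start:end]))
                for s1, s2 in pairs)
    return diffs / (len(pairs) * (end - start))

def sliding_pi(seqs, window, step):
    """(window start, pi) for each window along the alignment."""
    length = len(seqs[0])
    return [(s, pi_in_window(seqs, s, s + window))
            for s in range(0, length - window + 1, step)]

aln = ["AAAATTTT", "AAAATTTA", "AAAATTCA"]  # toy 3-sequence alignment
windows = sliding_pi(aln, window=4, step=4)
```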
Abstract:
The IRA and the NPL have submitted ampoules of 166mHo to the International Reference System (SIR) for activity comparison at the Bureau International des Poids et Mesures, thus becoming the third and fourth participants since 1989. The five samples of known activity of 166mHo now recorded in the SIR have activities between about 70 kBq and 500 kBq. The new results have enabled a re-evaluation of the key comparison reference value (KCRV), and the degrees of equivalence between each equivalent activity measured in the SIR and the KCRV have been calculated. The results are given in the form of a matrix for these four NMIs, together with the recalculated degrees of equivalence of an APMP regional comparison held in 2000 (comparison identifier APMP.RI(II)-K2.Ho-166m) for six other NMIs. A graphical presentation is also given.
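A degree of equivalence is the difference between a participant's result and the KCRV. Taking the KCRV as an inverse-variance weighted mean of the reported activities (a common estimator, though the actual SIR evaluation procedure may differ, and the values below are invented), the calculation can be sketched as:

```python
def weighted_mean(values, uncs):
    """Inverse-variance weighted mean, a common KCRV estimator."""
    weights = [1.0 / u**2 for u in uncs]
    return sum(w * v for w, v in zip(weights, values)) / sum(weights)

def degrees_of_equivalence(values, uncs):
    """d_i = x_i - KCRV for each participant's equivalent activity."""
    kcrv = weighted_mean(values, uncs)
    return kcrv, [v - kcrv for v in values]

# Hypothetical equivalent activities (kBq) with standard uncertainties
kcrv, d = degrees_of_equivalence([100.0, 102.0], [1.0, 1.0])
```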
Abstract:
Amplified fragment length polymorphism (AFLP) is a cheap and efficient protocol for generating large sets of genetic markers. This technique has become increasingly used during the last decade in various fields of biology, including population genomics, phylogeography, and genome mapping. Here, we present RawGeno, an R library dedicated to the automated scoring of AFLPs (i.e., the coding of electropherogram signals into ready-to-use datasets). Our program includes a complete suite of tools for binning, editing, visualizing, and exporting results obtained from AFLP experiments. RawGeno can be used either with command lines and program analysis routines or through a user-friendly graphical user interface. We describe the whole RawGeno pipeline along with recommendations for (a) setting up the analysis of electropherograms in combination with PeakScanner, a program freely distributed by Applied Biosystems; (b) performing quality checks; (c) defining bins and proceeding to scoring; (d) filtering nonoptimal bins; and (e) exporting results in different formats.
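Step (c), defining bins and scoring, can be sketched as grouping fragment sizes pooled across samples into bins and then building a presence/absence matrix. The bin width and the grouping rule below are assumptions for illustration, not RawGeno's defaults.

```python
def make_bins(sizes, width=1.0):
    """Group sorted fragment sizes (bp) into bins: a size opens a new
    bin when it lies more than `width` bp from the current bin's start."""
    bins = []
    for s in sorted(sizes):
        if not bins or s - bins[-1][0] > width:
            bins.append([s, s])
        else:
            bins[-1][1] = s
    return [(lo, hi) for lo, hi in bins]

def score(samples, bins):
    """Presence/absence (1/0) matrix: one row per sample, one column per bin."""
    return [[int(any(lo <= s <= hi for s in peaks)) for lo, hi in bins]
            for peaks in samples]

samples = {"ind1": [100.1, 150.4], "ind2": [100.3, 200.0]}  # peak sizes (bp)
all_sizes = [s for peaks in samples.values() for s in peaks]
bins = make_bins(all_sizes)
matrix = score(list(samples.values()), bins)
```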
Abstract:
This paper analyses and discusses arguments that emerge from a recent discussion about the proper assessment of the evidential value of correspondences observed between the characteristics of a crime stain and those of a sample from a suspect when (i) this latter individual is found as a result of a database search and (ii) remaining database members are excluded as potential sources (because of different analytical characteristics). Using a graphical probability approach (i.e., Bayesian networks), the paper here intends to clarify that there is no need to (i) introduce a correction factor equal to the size of the searched database (i.e., to reduce a likelihood ratio), nor to (ii) adopt a propositional level not directly related to the suspect matching the crime stain (i.e., a proposition of the kind 'some person in (outside) the database is the source of the crime stain' rather than 'the suspect (some other person) is the source of the crime stain'). The present research thus confirms existing literature on the topic that has repeatedly demonstrated that the latter two requirements (i) and (ii) should not be a cause of concern.
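The point that no database-size correction is needed can be illustrated numerically. Under a uniform prior over N potential sources and a random-match probability gamma for non-sources (a standard simplification in this literature, not the paper's Bayesian network), excluding the other database members can only raise the posterior probability that the matching suspect is the source:

```python
def posterior_source(N, gamma, excluded):
    """P(matching suspect is the source): uniform prior over N potential
    sources, random-match probability gamma for non-sources, and
    `excluded` other individuals ruled out by non-matching profiles."""
    others = N - 1 - excluded  # unexcluded alternative sources
    return 1.0 / (1.0 + others * gamma)

N, gamma, n_db = 1_000_000, 1e-6, 100_000
p_probable_cause = posterior_source(N, gamma, excluded=0)
p_database_search = posterior_source(N, gamma, excluded=n_db - 1)
# Excluding database members can only raise the posterior, so no
# division of the likelihood ratio by the database size is warranted.
```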
Abstract:
The present paper focuses on the analysis and discussion of a likelihood ratio (LR) development for propositions at the hierarchical level known in this context as the 'offence level'. Existing literature on the topic has considered LR developments for so-called offender-to-scene transfer cases. These settings involve, in their simplest form, a single stain found at a crime scene, but with possible uncertainty about the degree to which that stain is relevant (i.e. whether it was left by the offender). Extensions to multiple stains or multiple offenders have also been reported. The purpose of this paper is to discuss the development of a LR for offence level propositions when case settings involve potential transfer in the opposite direction, i.e. victim/scene-to-offender transfer. This setting has not previously been considered. The rationale behind the proposed LR is illustrated through graphical probability models (i.e. Bayesian networks). The role of various uncertain parameters is investigated through sensitivity analyses as well as simulations.
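For the classical offender-to-scene setting mentioned above, the offence-level LR for a single stain with relevance probability r and random-match probability gamma reduces, under common simplifying assumptions (no innocent deposit by the suspect), to LR = (r + (1 - r) * gamma) / gamma = r/gamma + (1 - r). The paper's victim/scene-to-offender development is different; this sketch only illustrates a sensitivity analysis over the relevance term:

```python
def offence_level_lr(r, gamma):
    """Offence-level LR for a single scene stain under simplifying
    assumptions: r = probability the stain is relevant (left by the
    offender), gamma = random-match probability."""
    return r / gamma + (1.0 - r)

# Sensitivity of the LR to the relevance term, with gamma = 0.001:
# full relevance recovers 1/gamma; low relevance erodes the LR sharply.
sensitivity = {r: offence_level_lr(r, 0.001) for r in (1.0, 0.5, 0.1)}
```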
Abstract:
Glandora oleifolia (Lapeyr.) D.C.Thomas [= Lithodora oleifolia (Lapeyr.) Griseb.] is one of the most narrowly restricted endemics of the Iberian Peninsula, since it has a single population (divided into two nuclei separated by barely 5 km) located in the Alta Garrotxa, in the Pyrenees of Girona. Glandora oleifolia is listed as VU in the 2008 Red List of Spanish Vascular Flora and enjoys legal protection in Catalonia (it is included in the recent Catalogue of threatened flora of this autonomous community). As part of an ex situ conservation plan that the Marimurtra Botanical Garden (JBMiM) is developing for this threatened taxon, its genetic diversity was studied using allozymes and RAPDs. For both markers, the levels of within-population genetic variability were very low, as expected for endemics with a very small range. The genetic divergence between the two population nuclei was also very low, below 10 % with both markers. The very small distribution area of this borage species, despite its moderate population size (about 5,000 individuals), together with such low genetic variability, makes it a taxon at high risk of extinction. Moreover, the species is subject to enormous anthropogenic pressure (over-visitation), while low seed production due to the scarce activity of pollinators constitutes an additional threat.
Abstract:
Researchers should continuously ask how to improve the models we rely on to make financial decisions in terms of the planning, design, construction, and maintenance of roadways. This project presents an alternative tool that supplements local decision making while maintaining a full appreciation of the complexity and sophistication of today's regional model and local traffic impact study methodologies. This alternative method is tailored to the desires of local agencies, which requested a better, faster, and easier way to evaluate land uses and their impact on future traffic demands at the sub-area or project corridor level. A particular emphasis was placed on scenario planning for currently undeveloped areas. The scenario planning tool was developed using actual land use and roadway information for the communities of Johnston and West Des Moines, Iowa. Both communities used the output from this process to make regular decisions regarding infrastructure investment, design, and land use planning. The City of Johnston case study included forecasting future traffic for the western portion of the city within a 2,600-acre area, which included 42 intersections. The City of West Des Moines case study included forecasting future traffic for the city's western growth area, covering over 30,000 acres and 331 intersections. Both studies included forecasting a.m. and p.m. peak-hour traffic volumes based upon a variety of different land use scenarios. The tool developed took geographic information system (GIS)-based parcel and roadway information, converted the data into a graphical spreadsheet tool, allowed the user to conduct trip generation, distribution, and assignment, and then automatically converted the data into a Synchro roadway network, which allows for capacity analysis and visualization. The operational delay outputs were converted back into a GIS thematic format for contrast and further scenario planning.
This project has laid the groundwork for improving both planning and civil transportation decision making at the sub-regional, super-project level.
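The trip generation and distribution steps mentioned above can be sketched with per-land-use trip rates and a singly constrained gravity model. The zone names, trip rates, and impedances below are invented for illustration, not the Iowa tool's actual parameters.

```python
def trip_generation(parcels, rates):
    """Trips produced per zone: land-use quantity times trip rate."""
    return {zone: sum(qty * rates[use] for use, qty in uses.items())
            for zone, uses in parcels.items()}

def gravity_distribution(productions, attractions, impedance):
    """Singly constrained gravity model: zone i's productions are split
    among destinations j in proportion to attractions[j] / impedance."""
    trips = {}
    for i, p in productions.items():
        weights = {j: attractions[j] / impedance[i][j] for j in attractions}
        total = sum(weights.values())
        trips[i] = {j: p * w / total for j, w in weights.items()}
    return trips

# Hypothetical p.m. peak: 100 dwelling units producing 1 trip each,
# distributed between two destinations of unequal attractiveness
prods = trip_generation({"west": {"dwelling": 100}}, {"dwelling": 1.0})
dist = gravity_distribution(prods, {"cbd": 3.0, "mall": 1.0},
                            {"west": {"cbd": 1.0, "mall": 1.0}})
```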
Abstract:
The Iowa Department of Transportation is committed to improved management systems, which in turn has led to increased automation to record and manage construction data. A possible improvement to the current data management system can be found in pen-based computers. Pen-based computers coupled with user-friendly software are now at the point where an individual's handwriting can be captured and converted to typed text for data collection. It would appear that pen-based computers are sufficiently advanced to be used by construction inspectors to record daily project data. The objective of this research was to determine: (1) whether pen-based computers are durable enough to allow maintenance-free operation for field work during Iowa's construction season; and (2) whether pen-based computers can be used effectively by inspectors with little computer experience. The pen-based computer's handwriting recognition was not fast or accurate enough to be successfully utilized. The IBM ThinkPad with the pen pointing device did prove useful for working in the Windows graphical environment. The pen was used for pointing, selecting, and scrolling in Windows applications because of its intuitive nature.