83 results for Blog datasets
Abstract:
This paper introduces an approach that uses Total Unduplicated Reach and Frequency (TURF) analysis to design a product line through a binary linear programming model. This improves the efficiency of the search for a solution compared with the algorithms used to date. The results obtained with our exact algorithm are presented, and the method proves extremely efficient, both in obtaining optimal solutions and in computing time, even for very large instances of the problem at hand. Furthermore, the proposed technique allows the model to be extended to overcome the main drawbacks of TURF analysis in practice.
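The TURF objective described above, choosing the k-product line that maximizes unduplicated reach (the number of respondents covered by at least one product), can be illustrated with a minimal sketch. The exhaustive search below is a hypothetical illustration of the objective for small instances, not the paper's binary-programming algorithm; respondent data and product names are invented.

```python
from itertools import combinations

def turf_reach(respondents, line):
    """Unduplicated reach: respondents covered by at least one product in the line."""
    return sum(1 for prefs in respondents if prefs & set(line))

def best_line(respondents, products, k):
    """Exhaustively pick the k-product line maximizing reach (small instances only)."""
    return max(combinations(products, k),
               key=lambda line: turf_reach(respondents, line))

# Each respondent is represented by the set of products they would buy.
respondents = [{"A"}, {"A", "B"}, {"C"}, {"B", "C"}, {"D"}]
line = best_line(respondents, ["A", "B", "C", "D"], 2)  # -> ("A", "C"), reach 4
```

The exact algorithm in the paper replaces this exponential enumeration with a binary linear program, which is what makes very large instances tractable.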
Abstract:
Age data frequently display excess frequencies at round or attractive ages, such as even numbers and multiples of five. This phenomenon of age heaping has been viewed as a problem in previous research, especially in demography and epidemiology. We see it as an opportunity and propose its use as a measure of human capital that can yield comparable estimates across a wide range of historical contexts. A simulation study yields methodological guidelines for measuring and interpreting differences in age heaping, while analysis of contemporary and historical datasets demonstrates a robust correlation between age heaping and literacy at both the individual and the aggregate level. To illustrate the method, we generate estimates of human capital in Europe over the very long run, which support the hypothesis of a major increase in human capital preceding the Industrial Revolution.
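Heaping of the kind described is commonly quantified with Whipple's index, a standard demographic measure (not named in the abstract, so this is an illustrative assumption about the methodology): the share of reported ages 23–62 ending in 0 or 5, scaled so that 100 means no heaping and 500 means every reported age is a multiple of five.

```python
def whipple_index(ages):
    """Whipple's index over reported ages 23-62: 100 = no heaping, 500 = total heaping."""
    window = [a for a in ages if 23 <= a <= 62]
    heaped = sum(1 for a in window if a % 5 == 0)
    return 500.0 * heaped / len(window)

# Uniformly reported ages 23..62: exactly 8 of the 40 ages are multiples of 5.
print(whipple_index(range(23, 63)))  # -> 100.0 (no heaping)
```

Higher index values indicate more heaping and, under the paper's hypothesis, lower numeracy.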
Abstract:
We summarize the progress in whole-genome sequencing and analyses of primate genomes. These emerging genome datasets have broadened our understanding of primate genome evolution, revealing unexpected and complex patterns of evolutionary change. This includes the characterization of genome structural variation, episodic changes in the repeat landscape, differences in gene expression, new models regarding speciation, and the ephemeral nature of the recombination landscape. The functional characterization of genomic differences important in primate speciation and adaptation remains a significant challenge. Limited access to biological materials, the lack of detailed phenotypic data and the endangered status of many critical primate species have significantly hampered research into the genetic basis of primate evolution. Next-generation sequencing technologies promise to greatly expand the number of available primate genome sequences; however, such draft genome sequences will likely miss critical genetic differences within complex genomic regions unless dedicated efforts are put forward to understand the full spectrum of genetic variation.
Abstract:
We combine existing balance sheet and stock market data with two new datasets to study whether, how much, and why bank lending to firms matters for the transmission of monetary policy. The first new dataset enables us to quantify the bank dependence of firms precisely, as the ratio of bank debt to total assets. We show that a two standard deviation increase in the bank dependence of a firm makes its stock price about 25% more responsive to monetary policy shocks. We explore the channels through which this effect occurs, and find that the stock prices of bank-dependent firms that borrow from financially weaker banks display a stronger sensitivity to monetary policy shocks. This finding is consistent with the bank lending channel, a theory according to which the strength of bank balance sheets matters for monetary policy transmission. We construct a new database of hedging activities and show that the stock prices of bank-dependent firms that hedge against interest rate risk display a lower sensitivity to monetary policy shocks. This finding is consistent with an interest rate pass-through channel that operates via the direct transmission of policy rates to lending rates, associated with the widespread use of floating rates in bank loans and credit line agreements.
Abstract:
Ground clutter caused by anomalous propagation (anaprop) can seriously affect radar rain-rate estimates, particularly in fully automatic radar processing systems, and, if not filtered, can produce frequent false alarms. A statistical study of anomalous propagation detected from two operational C-band radars in the northern Italian region of Emilia-Romagna is discussed, paying particular attention to its diurnal and seasonal variability. The analysis shows a high incidence of anaprop in summer, mainly in the morning and evening, due to the humid and hot summer climate of the Po Valley, particularly in the coastal zone. A comparison between different techniques and datasets for retrieving the vertical profile of the refractive index gradient in the boundary layer is then presented, with emphasis on their capability to detect anomalous propagation conditions. Furthermore, beam path trajectories are simulated using a multilayer ray-tracing model, and the influence of the propagation conditions on the beam trajectory and shape is examined. High-resolution radiosounding data are identified as the best available dataset for reproducing the local propagation conditions accurately, while lower-resolution standard TEMP data suffer from interpolation degradation, and Numerical Weather Prediction model data (Lokal Modell) are able to retrieve a tendency toward superrefraction but not to detect ducting conditions. By observing the ray tracing of the centre, lower and upper limits of the radar antenna's 3-dB half-power main beam lobe, it is concluded that ducting layers produce a change in the measured volume and in the power distribution that can lead to an additional error in the reflectivity estimate and, subsequently, in the estimated rainfall rate.
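The propagation regimes discussed (standard refraction, superrefraction, ducting) follow from the vertical gradient of radio refractivity N. A minimal sketch using the commonly adopted thresholds (ducting below -157 N-units/km, superrefraction between -157 and -79; these threshold values are standard conventions, not figures taken from this paper):

```python
def propagation_regime(dN_dh):
    """Classify radio propagation from the refractivity gradient dN/dh (N-units/km).

    Conventional thresholds: ducting < -157, superrefraction -157..-79,
    normal -79..0, subrefraction > 0.
    """
    if dN_dh < -157:
        return "ducting"
    if dN_dh < -79:
        return "superrefraction"
    if dN_dh <= 0:
        return "normal"
    return "subrefraction"

print(propagation_regime(-40))   # standard atmosphere (~ -39 N-units/km) -> "normal"
print(propagation_regime(-200))  # strong negative gradient -> "ducting"
```

A humid, hot boundary layer such as the summer Po Valley steepens the negative gradient, which is why anaprop peaks there in the morning and evening.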
Abstract:
Objective: Health status measures usually have an asymmetric distribution and present a high percentage of respondents with the best possible score (ceiling effect), especially when they are assessed in the overall population. Different methods that take the ceiling effect into account have been proposed for modelling this type of variable: the tobit models, the Censored Least Absolute Deviations (CLAD) models, and the two-part models, among others. The objective of this work was to describe the tobit model and compare it with the Ordinary Least Squares (OLS) model, which ignores the ceiling effect.
Methods: Two different data sets were used to compare both models: (a) real data coming from the European Study of Mental Disorders (ESEMeD), used to model the EQ-5D index, one of the utility measures most commonly used for the evaluation of health status; and (b) data obtained from simulation. Cross-validation was used to compare the predicted values of the tobit and OLS models. The following estimators were compared: the percentage of absolute error (R1), the percentage of squared error (R2), the Mean Squared Error (MSE) and the Mean Absolute Prediction Error (MAPE). Different datasets were created for different values of the error variance and different percentages of individuals with a ceiling effect. The coefficient estimates, the percentage of explained variance and the plots of residuals versus predicted values obtained under each model were compared.
Results: With regard to the ESEMeD study, the predicted values obtained with the OLS model and those obtained with the tobit model were very similar. The regression coefficients of the linear model were consistently smaller than those from the tobit model. In the simulation study, we observed that when the error variance was small (s = 1), the tobit model yielded unbiased coefficient estimates and accurate predicted values, especially when the percentage of individuals with the highest possible score was small. However, when the error variance was greater (s = 10 or s = 20), the percentage of explained variance and the predicted values of the tobit model were more similar to those obtained with the OLS model.
Conclusions: The proportion of variability accounted for by the models and the percentage of individuals with the highest possible score have an important effect on the performance of the tobit model relative to the linear model.
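The attenuation of OLS coefficients under a ceiling effect, which the tobit likelihood is designed to correct, can be reproduced with a toy simulation (all values below are illustrative, not the parameters of the study): censoring the outcome at a ceiling biases the fitted slope toward zero.

```python
import random

random.seed(0)

def ols_slope(x, y):
    """Closed-form simple-regression slope: cov(x, y) / var(x)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    var = sum((xi - mx) ** 2 for xi in x)
    return cov / var

true_slope, ceiling = 0.5, 4.0
x = [random.uniform(0, 10) for _ in range(5000)]
latent = [1.0 + true_slope * xi + random.gauss(0, 1) for xi in x]  # uncensored outcome
observed = [min(yi, ceiling) for yi in latent]                     # ceiling-censored outcome

attenuated = ols_slope(x, observed)  # biased toward zero relative to true_slope
```

This is consistent with the abstract's finding that the linear model's coefficients were consistently smaller than the tobit model's.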
Abstract:
During infection with human immunodeficiency virus (HIV), immune pressure from cytotoxic T-lymphocytes (CTLs) selects for viral mutants that confer escape from CTL recognition. These escape variants can be transmitted between individuals, where, depending upon their cost to viral fitness and the CTL responses made by the recipient, they may revert. The rates of within-host evolution and their concordant impact upon the rate of spread of escape mutants at the population level are uncertain. Here we present a mathematical model of within-host evolution of escape mutants, transmission of these variants between hosts and subsequent reversion in new hosts. The model is an extension of the well-known SI model of disease transmission and includes three further parameters that describe host immunogenetic heterogeneity and rates of within-host viral evolution. We use the model to explain why some escape mutants appear to have stable prevalence whilst others are spreading through the population. Further, we use it to compare diverse datasets on CTL escape, highlighting where different sources agree or disagree on within-host evolutionary rates. The several dozen CTL epitopes we survey from HIV-1 gag, RT and nef reveal a relatively sedate rate of evolution, with average rates of escape measured in years and reversion in decades. For many epitopes in HIV, occasional rapid within-host evolution is not reflected in fast evolution at the population level.
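The structure of such an extended SI model, transmission of both variants plus within-host escape and reversion, can be sketched as a toy ODE system integrated by Euler steps. All rates below are hypothetical round numbers chosen for illustration, not the paper's estimates, and host immunogenetic heterogeneity is deliberately omitted.

```python
def simulate(beta=0.3, escape=0.2, reversion=0.05, dt=0.01, steps=20_000):
    """Toy two-strain SI model: S -> I_w or I_e by transmission;
    within-host escape I_w -> I_e and reversion I_e -> I_w at constant rates."""
    S, Iw, Ie = 0.99, 0.01, 0.0  # fractions: susceptible, wild-type, escape-infected
    for _ in range(steps):
        infect_w = beta * S * Iw
        infect_e = beta * S * Ie
        dS = -(infect_w + infect_e)
        dIw = infect_w - escape * Iw + reversion * Ie
        dIe = infect_e + escape * Iw - reversion * Ie
        S, Iw, Ie = S + dt * dS, Iw + dt * dIw, Ie + dt * dIe
    return S, Iw, Ie

S, Iw, Ie = simulate()  # with fast escape and slow reversion, I_e comes to dominate
```

When the escape rate greatly exceeds the reversion rate, the escape variant spreads; when reversion dominates, its prevalence stabilizes, which mirrors the two population-level patterns the abstract distinguishes.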
Abstract:
Terrestrial laser scanning (TLS) is one of the most promising surveying techniques for rock-slope characterization and monitoring. Landslide and rockfall movements can be detected by comparing sequential scans. One of the most pressing challenges in natural hazards is the combined temporal and spatial prediction of rockfall. An outdoor experiment was performed to ascertain whether the TLS instrumental error is small enough to enable detection of precursory displacements of millimetric magnitude. The experiment consisted of a known displacement of three objects relative to a stable surface. Results show that millimetric changes cannot be detected by analysing the unprocessed datasets. Displacement measurements are improved considerably by applying Nearest Neighbour (NN) averaging, which reduces the error (1σ) by up to a factor of 6. This technique was applied to the displacements prior to the April 2007 rockfall event at Castellfollit de la Roca, Spain. The maximum precursory displacement measured was 45 mm, approximately 2.5 times the standard deviation of the model comparison, hampering the distinction between actual displacement and instrumental error using conventional methodologies. Encouragingly, the precursory displacement was clearly detected by applying the NN averaging method. These results show that millimetric displacements prior to failure can be detected using TLS.
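The NN averaging idea can be sketched as follows: each point's apparent displacement is replaced by the mean over its k nearest neighbours, which for spatially uncorrelated instrumental noise shrinks the standard error by roughly sqrt(k). This is a simplified, brute-force illustration with synthetic data, not the authors' processing chain.

```python
import math
import random

random.seed(1)

def nn_average(points, values, k):
    """Replace each value by the mean over its k nearest points (brute-force search)."""
    out = []
    for p in points:
        nearest = sorted(range(len(points)),
                         key=lambda j: (points[j][0] - p[0]) ** 2
                                     + (points[j][1] - p[1]) ** 2)[:k]
        out.append(sum(values[j] for j in nearest) / k)
    return out

def std(v):
    m = sum(v) / len(v)
    return math.sqrt(sum((x - m) ** 2 for x in v) / len(v))

# Synthetic scan patch: true displacement 5 mm everywhere, instrumental noise sigma = 3 mm.
pts = [(random.random(), random.random()) for _ in range(400)]
raw = [5.0 + random.gauss(0, 3) for _ in pts]
smoothed = nn_average(pts, raw, k=25)  # scatter drops well below the raw noise level
```

The cost of the averaging is spatial resolution: displacements varying over scales smaller than the neighbourhood are smeared out, so k trades noise against detail.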
Abstract:
This case study deals with rock face monitoring in urban areas using a terrestrial laser scanner (TLS). The pilot study area is an almost vertical, fifty-metre-high cliff, on top of which the village of Castellfollit de la Roca is located. Rockfall activity is currently causing a retreat of the rock face, which may endanger the houses located at its edge. The TLS datasets consist of high-density 3-D point clouds acquired from five stations, nine times over a span of 22 months (from March 2006 to January 2008). Change detection, i.e. rockfalls, was performed through a sequential comparison of datasets. Two types of mass movement were detected in the monitoring period: (a) detachment of single basaltic columns, with magnitudes below 1.5 m³, and (b) detachment of groups of columns, with magnitudes of 1.5 to 150 m³. Furthermore, the historical record revealed (c) the occurrence of slab failures with magnitudes higher than 150 m³. Displacements of a likely slab failure were measured, suggesting an apparently stationary stage. Even though failures are clearly episodic, our results, together with the study of the historical record, enabled us to estimate a mean detachment of material of 46 to 91.5 m³ year⁻¹. The application of TLS considerably improved our understanding of the rockfall phenomena in the study area.
Abstract:
The importance of students' mechanisms for regulating their own learning, as well as of the quality of teachers' feedback and formative assessment, is on the international agendas for quality in higher education. The use of Web 2.0 reading, writing and interaction tools falls squarely within this general purpose. This article presents the results of an inter-university action-research project on the assessment of competences with blogs. More specifically, it presents three educational scenarios in which the blog is used as a reflective assessment tool; it examines the degree to which students become aware of their learning process, how they evidence it and how they self-regulate it; and, finally, it analyses the type of feedback used by the teaching staff. Results are presented from the documentary analysis both of the work produced in the blogs and of the feedback provided by the teachers. The results show the weight that the quality of the teachers' feedback carries in the process of supporting the construction of competences.
Abstract:
Information about the genomic coordinates and the sequence of experimentally identified transcription factor binding sites is found scattered across a variety of diverse formats. The availability of standard collections of such high-quality data is important to design, evaluate and improve novel computational approaches to identify binding motifs in promoter sequences from related genes. ABS (http://genome.imim.es/datasets/abs2005/index.html) is a public database of known binding sites identified in promoters of orthologous vertebrate genes that have been manually curated from the literature. We have annotated 650 experimental binding sites from 68 transcription factors and 100 orthologous target genes in human, mouse, rat or chicken genome sequences. Computational predictions and promoter alignment information are also provided for each entry. A simple and easy-to-use web interface facilitates data retrieval, allowing different views of the information. In addition, release 1.0 of ABS includes a customizable generator of artificial datasets based on the known sites contained in the collection and an evaluation tool to aid in the training and assessment of motif-finding programs.
Abstract:
PURPOSE: Pharmacovigilance methods have advanced greatly during the last decades, making post-market drug assessment an essential component of drug evaluation. These methods mainly rely on spontaneous reporting systems and health information databases to collect expertise from huge amounts of real-world reports. The EU-ADR Web Platform was built to further facilitate accessing, monitoring and exploring these data, enabling an in-depth analysis of adverse drug reaction risks.
METHODS: The EU-ADR Web Platform exploits the wealth of data collected within a large-scale European initiative, the EU-ADR project. Millions of electronic health records, provided by national health agencies, are mined for specific drug events, which are correlated with literature, protein and pathway data, resulting in a rich drug-event dataset. Next, advanced distributed-computing methods are tailored to coordinate the execution of data-mining and statistical analysis tasks. This permits obtaining a ranked drug-event list, removing spurious entries and highlighting relationships with high risk potential.
RESULTS: The EU-ADR Web Platform is an open workspace for the integrated analysis of pharmacovigilance datasets. Using this software, researchers can access a variety of tools provided by distinct partners in a single centralized environment. Besides performing standalone drug-event assessments, they can also control the pipeline for an improved batch analysis of custom datasets. Drug-event pairs can be substantiated and statistically analysed within the platform's innovative working environment.
CONCLUSIONS: A pioneering workspace that helps explain the biological path of adverse drug reactions was developed within the EU-ADR project consortium. This tool, targeted at the pharmacovigilance community, is available online at https://bioinformatics.ua.pt/euadr/. Copyright © 2012 John Wiley & Sons, Ltd.
Abstract:
Development project for a web application for reading news from a blog (RSS reader).
Abstract:
Some of the research carried out in recent years places ever greater emphasis on the importance of the methodological approach under which new technologies are integrated into the classroom. The study presented here starts from the analysis of an experience of ICT integration within a global project-based learning proposal carried out in two pre-school classrooms of the La Sínia school in Vic. The general purpose of the work is to show the effects these technologies have on the pupils and teachers involved, as well as the contribution of a resource such as the blog to the relationship between school and family, taking the project carried out as the guiding thread. In this regard, some of the most relevant results of the research show how the roles played by the pupils and teachers in the classroom directly influence the pupils' attitudes and motivation towards the tasks, and they ultimately highlight the importance of the didactic approach under which ICT resources are used in the classroom. On the other hand, the blog was found not to have had the same impact on all families, for various particular reasons that will require further work to promote it much more among parents.
Abstract:
In nursery school, the bond between the educational team and the family is of great importance and, in the changing society in which we live, it is important to seek new channels of communication. This research is a case study of the use of ICT as a means of communication between school and family at three nursery schools. With the aim of describing and analysing these tools, interviews, questionnaires and document analysis are used as data-collection instruments. The results show that the most widely used tools are the blog and e-mail, with the function of informing families. These data make it possible to identify use, function and audience as the necessary criteria for choosing the most suitable ICT tool for the family-school relationship of a given centre, and for drawing up a guide to choosing ICT tools as a means of relation between the two educational agents.