A robust Bayesian approach to null intercept measurement error model with application to dental data
Abstract:
Measurement error models often arise in epidemiological and clinical research. In this setting it is usually assumed that the latent variable is normally distributed; however, the normality assumption may not always hold. The skew-normal/independent distributions form a class of asymmetric, thick-tailed distributions that includes the skew-normal distribution as a special case. In this paper, we explore the use of skew-normal/independent distributions as a robust alternative for the null intercept measurement error model under a Bayesian paradigm. We assume that the random errors and the unobserved value of the covariate (the latent variable) jointly follow a skew-normal/independent distribution, providing an appealing robust alternative to the routine use of the symmetric normal distribution in this type of model. Specific distributions examined include univariate and multivariate versions of the skew-normal, skew-t, skew-slash, and skew contaminated normal distributions. The methods developed are illustrated using a real data set from a dental clinical trial. (C) 2008 Elsevier B.V. All rights reserved.
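As a rough illustration, a null intercept measurement error model with a joint skew-normal/independent (SNI) assumption can be sketched as follows; the notation here is assumed for illustration and is not taken verbatim from the paper:

```latex
% Null intercept model: no intercept term, latent covariate x_i
\begin{align*}
  y_{ij} &= \beta_j x_i + e_{ij}, \qquad i = 1,\dots,n,\; j = 1,\dots,p,\\
  (x_i,\, e_{i1}, \dots, e_{ip})^{\top} &\sim \mathrm{SNI}(\boldsymbol{\mu}, \boldsymbol{\Sigma}, \boldsymbol{\lambda};\, H).
\end{align*}
```

Different choices of the mixing distribution $H$ recover the members listed in the abstract: a degenerate $H$ gives the skew-normal, while suitable gamma, beta, and two-point mixing distributions give the skew-t, skew-slash, and skew contaminated normal, respectively.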
Abstract:
Thermal analysis has been extensively used to obtain information about drug-polymer interactions and to perform pre-formulation studies of pharmaceutical dosage forms. In this work, biodegradable microparticles of poly(D,L-lactide-co-glycolide) (PLGA) containing ciprofloxacin hydrochloride (CP) in various drug:polymer ratios were obtained by spray drying. The main purpose of this study was to investigate the effect of the spray drying process on drug-polymer interactions and on the stability of the microparticles, using differential scanning calorimetry (DSC), thermogravimetry (TG), derivative thermogravimetry (DTG), and infrared spectroscopy (IR). The results showed that the high levels of encapsulation efficiency were dependent on the drug:polymer ratio. DSC and TG/DTG analyses showed that the thermal profiles of physical mixtures of the microparticle components differed from the signals obtained with the pure substances. Thermal analysis data disclosed that a physical interaction between CP and PLGA had occurred at high temperatures. The DSC and TG profiles of the drug-loaded microparticles were very similar to those of the physical mixtures, and it was possible to characterize the thermal properties of the microparticles according to drug content. These data indicated that the spray drying technique does not affect the physicochemical properties of the microparticles. In addition, the results are in agreement with the IR data, demonstrating that no significant chemical interaction occurs between CP and PLGA in either the physical mixtures or the microparticles. In conclusion, we have found that the spray drying procedure used in this work is a reliable method to produce CP-loaded microparticles. (C) 2007 Elsevier B.V. All rights reserved.
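For context, encapsulation efficiency (EE) is conventionally computed as the ratio of actual to theoretical drug content; the figures below are invented for illustration and are not taken from the study.

```python
# Encapsulation efficiency (EE) as commonly defined for drug-loaded
# microparticles; the numbers are illustrative placeholders.
def encapsulation_efficiency(actual_drug_mg: float, theoretical_drug_mg: float) -> float:
    """EE (%) = actual drug content / theoretical drug content * 100."""
    return 100.0 * actual_drug_mg / theoretical_drug_mg

# Hypothetical batch: 18 mg of CP recovered where 20 mg was fed in.
ee = encapsulation_efficiency(18.0, 20.0)
print(f"EE = {ee:.1f}%")  # EE = 90.0%
```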
Abstract:
The cost of a road over its service life is a function of its design, the quality of construction, and the maintenance strategies and operations. Unfortunately, designers often neglect a very important aspect: the ability to perform future maintenance activities. The focus is mainly on other aspects, such as investment costs, traffic safety, aesthetic appearance, regional development, and environmental effects. This licentiate thesis is part of a Ph.D. project entitled "Road Design for lower maintenance costs" that aims to examine how life-cycle costs can be optimized by selecting appropriate geometrical designs for roads and their components. The result is expected to provide a basis for a new method, used in the road planning and design process, that applies life-cycle cost analysis with particular emphasis on road maintenance. The project started with a literature review intended to study the conditions causing increased needs for road maintenance, the efforts made by road authorities to satisfy those needs, and the improvement potential of considering maintenance aspects during planning and design. An investigation was carried out to identify the problems that obstruct due consideration of maintenance aspects during the road planning and design process. This investigation focused mainly on the road planning and design process at the Swedish Road Administration; however, the corresponding processes in Denmark, Finland, and Norway were also roughly evaluated to gain broader knowledge of the research subject. The investigation was carried out in two phases: data collection and data analysis. Data was collected by semi-structured interviews with experts involved in planning, design, and maintenance, and by a review of design-related documents. Data analyses were carried out using a method called "Change Analysis".
This investigation revealed a complex combination of problems that result in inadequate consideration of maintenance aspects, and identified several urgent needs for changes to eliminate these problems. Another study was carried out to develop a model for calculating the repair costs of damage to different road barrier types and to analyse how factors such as road type, speed limit, barrier type, barrier placement, type of road section, alignment, and seasonal effects affect barrier damage and the associated repair costs. This study used the "Case Study Research Method". Data was collected from 1087 barrier repairs in two regional offices of the Swedish Road Administration, the Central Region and the Western Region. A table was established for both regions containing the repair cost per vehicle kilometre for different combinations of barrier types, road types, and speed limits. This table can be used by designers when calculating the life-cycle costs of different road barrier types.
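The kind of lookup table described above can be sketched as follows; all records, costs, and traffic exposures here are invented for illustration, not drawn from the study.

```python
from collections import defaultdict

# Invented repair records: (barrier_type, road_type, speed_limit_kmh, cost_sek)
repairs = [
    ("W-beam", "motorway", 110, 12_000),
    ("W-beam", "motorway", 110, 9_000),
    ("cable", "2+1 road", 100, 5_500),
]
# Invented traffic exposure (vehicle-km) per combination over the study period.
exposure = {
    ("W-beam", "motorway", 110): 4_000_000,
    ("cable", "2+1 road", 100): 1_500_000,
}

# Sum repair costs per combination, then normalize by traffic exposure.
total_cost = defaultdict(float)
for barrier, road, speed, cost in repairs:
    total_cost[(barrier, road, speed)] += cost

cost_per_vkm = {k: total_cost[k] / exposure[k] for k in total_cost}
# e.g. cost_per_vkm[("W-beam", "motorway", 110)] -> 21000 / 4e6 = 0.00525 SEK/vkm
```

A designer could then multiply these rates by forecast traffic to compare barrier alternatives on a life-cycle basis.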
Abstract:
A challenge for the clinical management of Parkinson's disease (PD) is the large within- and between-patient variability in symptom profiles, as well as the emergence of motor complications, which represent a significant source of disability in patients. This thesis deals with the development and evaluation of methods and systems for supporting the management of PD using repeated measures, consisting of subjective assessments of symptoms and objective assessments of motor function through fine motor tests (spirography and tapping), collected by means of a telemetry touch screen device. One aim of the thesis was to develop methods for objective quantification and analysis of the severity of motor impairments represented in spiral drawings and tapping results. This was accomplished by first quantifying the digitized movement data with time series analysis and then using them in data-driven modelling to automate the assessment of symptom severity. The objective measures were then analysed with respect to subjective assessments of motor conditions. Another aim was to develop a method for providing information content comparable to clinical rating scales by combining subjective and objective measures into composite scores, using time series analysis and data-driven methods. The scores represent six symptom dimensions and an overall test score reflecting the global health condition of the patient. In addition, the thesis presents the development of a web-based system that provides a visual representation of symptoms over time, allowing clinicians to remotely monitor the symptom profiles of their patients. The quality of the methods was assessed by reporting different metrics of validity, reliability, and sensitivity to treatment interventions and to natural PD progression over time.
Results from two studies demonstrated that the methods developed for the fine motor tests performed well on these metrics, indicating that they are appropriate for quantitative, objective assessment of the severity of motor impairments in PD patients. The fine motor tests captured different symptoms: spiral drawing impairment and tapping accuracy related to dyskinesias (involuntary movements), whereas tapping speed related to bradykinesia (slowness of movement). A longitudinal data analysis indicated that the six symptom dimensions and the overall test score captured important elements of the information in the clinical scales and can be used to measure the effects of PD treatment interventions and disease progression. A usability evaluation of the web-based system showed that the information presented was comparable to qualitative clinical observations, and the system was recognized as a tool that will assist in the management of patients.
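The quantification of a tapping test can be illustrated with a toy sketch; the tap log, target positions, and metric definitions below are simplified stand-ins for illustration, not the thesis's exact measures.

```python
import math

# Hypothetical tap log: (timestamp_s, x, y), alternating between two targets.
taps = [(0.0, 100, 200), (0.4, 300, 200), (0.9, 102, 198), (1.3, 299, 203)]
targets = [(100, 200), (300, 200)]  # alternating left/right target centres

# Tapping speed (taps per second) relates to bradykinesia.
duration = taps[-1][0] - taps[0][0]
tapping_speed = (len(taps) - 1) / duration

# Mean distance from each tap to its intended target relates to accuracy.
errors = [math.dist((x, y), targets[i % 2]) for i, (_, x, y) in enumerate(taps)]
mean_accuracy_error = sum(errors) / len(errors)  # pixels
```

Time series of such per-session metrics could then feed the data-driven severity models described above.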
Abstract:
The accurate measurement of a vehicle's velocity is an essential feature in adaptive vehicle-activated sign systems. Since the velocities of the vehicles are acquired from a continuous-wave Doppler radar, data collection becomes challenging. Data accuracy is sensitive to the calibration of the radar on the road, yet clear methodologies for in-field calibration have not been carefully established; the signs are often installed by subjective judgment, which results in measurement errors. This paper develops a calibration method based on mining the collected data and matching individual vehicles travelling between two radars. The data were prepared in two ways: by cleaning and by reconstruction. The results showed that the proposed correction factor derived from the cleaned data corresponded well with the experimental factor obtained on site, and that it outperformed the factor derived from the reconstructed data.
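The matching idea can be sketched as follows, assuming two radars a known distance apart: each vehicle seen by the first radar is matched to the second-radar detection whose arrival time best fits the measured speed, and the travel-time speed serves as ground truth. The detections, spacing, and matching rule are invented for illustration.

```python
distance_m = 200.0  # assumed spacing between the two radar sites

# Hypothetical detections: (detection_time_s, measured_speed_m_s) at each radar.
radar_a = [(10.0, 22.0), (35.0, 18.0)]
radar_b = [(19.2, 21.5), (46.3, 17.6)]

ratios = []
for t_a, v_a in radar_a:
    # Match to the radar-B detection whose travel time best fits v_a.
    t_b, v_b = min(radar_b, key=lambda d: abs((d[0] - t_a) - distance_m / v_a))
    true_speed = distance_m / (t_b - t_a)          # speed from travel time
    ratios.append(true_speed / ((v_a + v_b) / 2))  # correction vs. measured mean

# Average ratio over matched vehicles gives a simple correction factor.
correction_factor = sum(ratios) / len(ratios)
```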
Abstract:
Background. Through a national policy agreement, over 167 million Euros will be invested in the Swedish National Quality Registries (NQRs) between 2012 and 2016. One of the policy agreement's intentions is to increase the use of NQR data for quality improvement (QI). However, the evidence is fragmented as to how the use of medical registries and the like leads to quality improvement, and little is known about non-clinical use. The aim was therefore to investigate the perspectives of Swedish politicians and administrators on quality improvement based on national registry data. Methods. Politicians and administrators from four county councils were interviewed. A qualitative content analysis guided by the Consolidated Framework for Implementation Research (CFIR) was performed. Results. The politicians' and administrators' perspectives on the use of NQR data for quality improvement were mainly assigned to three of the five CFIR domains. In the domain of intervention characteristics, data reliability and access within reasonable time were not considered entirely satisfactory, making it difficult for the politico-administrative leaderships to initiate, monitor, and support timely QI efforts. Still, politicians and administrators trusted the idea of using the NQRs as a base for quality improvement. In the domain of inner setting, the organizational structures were not sufficiently developed to utilize the advantages of the NQRs, and readiness for implementation appeared inadequate for two reasons. Firstly, the resources for data analysis and quality improvement were not considered sufficient at the politico-administrative or clinical level. Secondly, deficiencies in leadership engagement at multiple levels were described, and there was a lack of consensus on the politicians' role and level of involvement. Regarding the domain of outer setting, there was a lack of communication and cooperation between the county councils and the national NQR organizations. Conclusions.
The Swedish experiences show that a government-supported national system of well-funded, well-managed, and reputable national quality registries needs favorable local politico-administrative conditions to be used for quality improvement; such conditions are not yet in place according to local politicians and administrators.
Abstract:
Researchers analyzing spatiotemporal or panel data, which vary both across locations and over time, often find that their data have holes or gaps. This thesis explores alternative methods for filling those gaps and also suggests a set of techniques for evaluating those gap-filling methods to determine which works best.
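A minimal version of the evaluation idea reads: hide known values in a series to create artificial gaps, fill them, and score the fill against the hidden truth. Linear interpolation stands in here for whichever gap-filling method is under test; the series is invented.

```python
import math

series = [1.0, 2.0, 4.0, 8.0, 16.0, 32.0]  # hypothetical complete series
hidden = {2, 4}  # indices masked to create artificial gaps

observed = [v if i not in hidden else None for i, v in enumerate(series)]

def fill_linear(values):
    """Fill isolated gaps with the mean of the two neighbours (assumed present)."""
    filled = list(values)
    for i, v in enumerate(filled):
        if v is None:
            filled[i] = (filled[i - 1] + filled[i + 1]) / 2
    return filled

estimate = fill_linear(observed)
# Score only at the hidden positions, where the truth is known.
rmse = math.sqrt(sum((estimate[i] - series[i]) ** 2 for i in hidden) / len(hidden))
```

Repeating this with different masks and candidate fillers yields the comparison across methods that the thesis describes.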
Abstract:
Instrumentation and automation play a vital role in managing the water industry. These systems generate vast amounts of data that must be managed effectively to enable intelligent decision making. Time series data management software, commonly known as a data historian, is used for collecting and managing real-time (time series) information. More advanced software solutions provide a data infrastructure, or utility-wide Operations Data Management System (ODMS), that stores, manages, calculates, displays, shares, and integrates data from the multiple disparate automation and business systems used daily in water utilities. These ODMS solutions are proven and can manage data ranging from smart water meters to data shared across third-party corporations. This paper focuses on practical utility successes in the water industry, where utility managers are leveraging instantaneous access to data from proven, commercial off-the-shelf ODMS solutions to enable better real-time decision making. Successes include saving $650,000 per year in water loss control, safeguarding water quality, and saving millions of dollars in energy and asset management. Immediate opportunities exist to integrate academic research with these ODMS solutions in the field and to extend these successes to utilities around the world.
Abstract:
Existing distributed hydrologic models are complex and computationally demanding, limiting their use as rapid-forecasting policy-decision tools or even as classroom educational tools. In addition, platform dependence, rigid input/output data structures, and the lack of dynamic data interaction with pluggable software components inside existing proprietary frameworks restrict these models to specialized user groups. RWater is a web-based hydrologic analysis and modeling framework that utilizes the widely used R software within the HUBzero cyberinfrastructure of Purdue University. RWater is designed as an integrated framework for distributed hydrologic simulation, with subsequent parameter optimization and visualization schemes. It provides a platform-independent web-based interface, flexible data integration capacity, grid-based simulation, and user extensibility. RWater uses RStudio to simulate hydrologic processes on raster data obtained through conventional GIS pre-processing, and it integrates the Shuffled Complex Evolution (SCE) algorithm for parameter optimization. Moreover, RWater enables users to produce descriptive statistics and visualizations of the outputs at different temporal resolutions. The applicability of RWater will be demonstrated by applying it to two watersheds in Indiana for multiple rainfall events.
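Calibration-by-optimization of the kind SCE performs needs an objective function scoring simulated against observed discharge; a common choice in hydrology is the Nash-Sutcliffe efficiency (NSE). The sketch below shows only this objective side on invented discharge values, with SCE itself omitted; the abstract does not state which objective RWater uses.

```python
def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; <= 0 is no better
    than predicting the observed mean."""
    mean_obs = num = den = 0.0
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

# Hypothetical observed and simulated discharge (m^3/s) for one event.
obs = [1.2, 3.4, 8.9, 5.1, 2.0]
sim = [1.0, 3.0, 9.5, 5.0, 2.2]
score = nse(obs, sim)
```

An optimizer such as SCE would then search the model's parameter space to maximize this score (or minimize 1 - NSE).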
Abstract:
This article presents the data-rich findings of an experiment with patron-driven/demand-driven acquisition (DDA) of ebooks in two ways. First, it compares DDA ebook usage against the circulation of newly ordered hardcopy materials, both overall and between ebook and print usage within the same subject areas. Second, it experiments with DDA ebooks as a backup plan for unfunded requests left over at the end of the fiscal year.
Abstract:
Libraries seek active ways to innovate amidst macroeconomic shifts, growing online education to help alleviate ever-growing schedule conflicts as students juggle jobs and course schedules, as well as changing business models in publishing and evolving information technologies. Patron-driven acquisition (PDA), also known as demand-driven acquisition (DDA), offers numerous strengths in supporting university curricula in the context of these significant shifts. PDA is a business model centered on short-term loans and subsequent purchases of ebooks resulting directly from patrons' natural use stemming from their discovery of the ebooks in library catalogs where the ebooks' bibliographic records are loaded at regular intervals established between the library and ebook supplier. Winthrop University's PDA plan went live in October 2011, and this article chronicles the philosophical and operational considerations, the in-library collaboration, and technical preparations in concert with the library system vendor and ebook supplier. Short-term loan is invoked after a threshold is crossed, typically number of pages or time spent in the ebook. After a certain number of short-term loans negotiated between the library and ebook supplier, the next short-term loan becomes an automatic purchase after which the library owns the ebook in perpetuity. Purchasing options include single-user and multi-user licenses. Owing to high levels of need in college and university environments, Winthrop chose the multi-user license as the preferred default purchase. Only where multi-user licenses are unavailable does the automatic purchase occur with single-user title licenses. Data on initial use between October 2011 and February 2013 reveal that of all PDA ebooks viewed, only 30% crossed the threshold into short-term loans. Of all triggered short-term loans, Psychology was the highest-using. Of all ebook views too brief to trigger short-term loans, Business was the highest-using area. 
Although the data are still too young to draw conclusions after only a few months, thought-provoking usage differences between academic disciplines have begun to emerge. These differences should be considered in library plans for the best possible curricular support of each academic program. As higher education struggles with costs and course-delivery methods, libraries have an enduring lead role.
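The loan-to-purchase workflow described above can be sketched as a toy state machine; the trigger threshold and the number of loans before purchase are invented placeholders, since the actual values are negotiated between library and supplier.

```python
class PdaEbook:
    """Toy model of a DDA/PDA ebook: brief views are free, uses crossing a
    threshold trigger short-term loans, and after a negotiated number of
    loans the next qualifying use becomes an automatic purchase."""

    def __init__(self, loans_before_purchase=3):
        self.short_term_loans = 0
        self.purchased = False
        self.loans_before_purchase = loans_before_purchase

    def record_use(self, pages_viewed, trigger_pages=10):
        if self.purchased or pages_viewed < trigger_pages:
            return "browse"  # too brief to trigger, or already owned
        if self.short_term_loans < self.loans_before_purchase:
            self.short_term_loans += 1
            return "short-term loan"
        self.purchased = True  # library now owns the title in perpetuity
        return "purchase"

book = PdaEbook()
events = [book.record_use(p) for p in (5, 12, 15, 11, 20)]
# events -> ['browse', 'short-term loan', 'short-term loan',
#            'short-term loan', 'purchase']
```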
Abstract:
This article describes the analysis of interlibrary loan data to help inform collection management decisions, and offers guidance for formulating policies that distinguish borrowed titles indicative of gaps in the library's collection from those reflecting special-interest pursuits beyond the scope of the university curriculum.
Abstract:
Excessive labor turnover may be considered, to a great extent, an undesirable feature of a given economy, owing to considerations such as underinvestment in human capital by firms. Understanding the determinants and the evolution of turnover in a particular labor market is therefore of paramount importance, including for policy purposes. The present paper proposes an econometric analysis of turnover in the Brazilian labor market, based on a partial observability bivariate probit model. This model considers the interdependence of decisions taken by workers and firms, helping to elucidate the causes that lead each of them to end an employment relationship. The Employment and Unemployment Survey (PED), conducted by the State System of Data Analysis (SEADE) and the Inter-Union Department of Statistics and Socioeconomic Studies (DIEESE), provides data at the individual worker level, allowing for the estimation of the joint probabilities of decisions to quit or stay on the job on the worker's side, and to retain or fire the employee on the firm's side, during a given time period. The estimated parameters relate these probabilities to the characteristics of workers and job contracts, and to potential macroeconomic determinants across different time periods. The results confirm the theoretical prediction that the probability of termination of an employment relationship tends to decrease as the worker acquires specific skills. The results also show that a formal employment relationship reduces the probability of a quit decision by the worker, and of the firm's firing decision in non-industrial sectors. With regard to the evolution of quit probability over time, the results show that an increase in the unemployment rate inhibits quitting, although this effect tends to wane as the unemployment rate rises.
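A standard (Poirier-style) partial observability bivariate probit of the kind described above can be written as follows; the notation is illustrative rather than the paper's own. Let $y_{1i}=1$ if the worker decides to stay and $y_{2i}=1$ if the firm decides to retain, so that only their product, the survival of the employment relationship, is observed:

```latex
\begin{align*}
  y_{1i} &= \mathbf{1}\{x_{1i}'\beta_1 + \varepsilon_{1i} > 0\} \quad \text{(worker stays)},\\
  y_{2i} &= \mathbf{1}\{x_{2i}'\beta_2 + \varepsilon_{2i} > 0\} \quad \text{(firm retains)},\\
  z_i &= y_{1i}\, y_{2i}, \qquad
  \Pr(z_i = 1 \mid x_{1i}, x_{2i}) = \Phi_2\bigl(x_{1i}'\beta_1,\; x_{2i}'\beta_2;\; \rho\bigr),
\end{align*}
```

where $(\varepsilon_{1i}, \varepsilon_{2i})$ is bivariate standard normal with correlation $\rho$ and $\Phi_2$ is the bivariate normal CDF. Maximum likelihood on the observed $z_i$ then recovers the two decision equations jointly, provided the regressor sets $x_{1i}$ and $x_{2i}$ differ enough for identification.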
Abstract:
Private equity, the practice of funds or investors investing in companies not listed on a public stock exchange, has taken on growing importance in the financial world in recent years. Indeed, while the emergence of a private equity (PE) sector has been a major phenomenon in emerging markets since the mid-2000s, the global financial crisis weakened private equity in the developed world. This research therefore focuses on two countries with supposedly very different dynamics in this sector: France and Brazil. The objective is to discern general patterns of behavior in both PE sectors over the period 2006-2013, and to determine to what extent they are comparable. Using the literature as the conceptual source for the comparative framework to be developed, the research analyses whether market conditions and the institutional environment evolved in France and Brazil during the period studied, how they compare, and whether they affected the level of private equity activity - the supply of and demand for funds - in both countries. To identify these patterns, the research relies on a qualitative exploratory data analysis, based on a framework of PE-sector determinants identified in and drawn from the academic literature. This research contributes to the existing academic work on private equity through its comparative nature and its conclusions on the relevance of the aforementioned determinants to private equity activity in France and Brazil.