980 results for Computer Oriented Statistics


Relevance:

20.00%

Publisher:

Abstract:

Objectives: Therapeutic drug monitoring (TDM) aims to optimize treatment by individualizing the dosage regimen based on measured blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic (PK) and clinical expertise. Bayesian calculation represents the gold standard in TDM but requires computing assistance. The aim of this benchmarking study was to assess and compare computer tools designed to support TDM clinical activities.

Methods: The literature and the Internet were searched to identify software. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of the programs.

Results: Twelve software tools were identified, tested, and ranked, providing a comprehensive review of the available software characteristics. The number of drugs handled varies from 2 to more than 180, and some programs integrate different population types. Nevertheless, 8 programs offer the ability to add new drug models based on population PK data. Ten computer tools incorporate Bayesian computation to predict the dosage regimen (individual parameters are calculated from population PK models). All of them can compute a Bayesian a posteriori dosage adaptation based on a measured blood concentration, while 9 can also suggest an a priori dosage regimen based only on individual patient covariates. Among those applying Bayesian analysis, MM-USC*PACK uses a non-parametric approach. The top two programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated or less user-friendly.

Conclusions: Although two software packages rank at the top of the list, such complex tools may not fit all institutions, and each program must be evaluated against the individual needs of hospitals and clinicians. Programs should be easy and fast to use for routine activities, including by non-experienced users. Although interest in TDM tools is growing and considerable effort has been invested in recent years, there is still room for improvement, especially in terms of interfacing with institutional information systems, user-friendliness, data storage capability, and automated report generation.
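As a rough illustration of the Bayesian a posteriori step described above, the sketch below estimates individual PK parameters from a single measured concentration and then picks a dose against a target trough. The one-compartment model, population priors, target, and all numerical values are illustrative assumptions and are not taken from any of the benchmarked programs.

```python
# Minimal sketch of Bayesian a posteriori dose individualization for a
# hypothetical one-compartment oral drug; priors and numbers are invented.
import numpy as np
from scipy.optimize import minimize

# Population priors (log-normal): clearance CL (L/h) and volume V (L)
POP_MEAN = np.log(np.array([5.0, 50.0]))   # ln CL, ln V
POP_SD   = np.array([0.3, 0.25])           # between-subject variability
SIGMA    = 0.5                             # additive residual error (mg/L)

def conc(log_params, dose, times, ka=1.0, F=1.0):
    """Concentrations for a one-compartment model with first-order absorption."""
    cl, v = np.exp(log_params)
    ke = cl / v
    return (F * dose * ka / (v * (ka - ke))) * (np.exp(-ke * times) - np.exp(-ka * times))

def neg_log_posterior(log_params, dose, times, observed):
    pred = conc(log_params, dose, times)
    residual = np.sum((observed - pred) ** 2) / (2 * SIGMA ** 2)
    prior = np.sum((log_params - POP_MEAN) ** 2 / (2 * POP_SD ** 2))
    return residual + prior

# One observed trough concentration 12 h after a 500 mg dose
obs_times = np.array([12.0])
observed  = np.array([3.2])
fit = minimize(neg_log_posterior, POP_MEAN, args=(500.0, obs_times, observed))
cl_map, v_map = np.exp(fit.x)

# Suggest the dose whose predicted trough is closest to a 4 mg/L target
target = 4.0
candidate_doses = np.arange(250, 1001, 50)
troughs = [conc(fit.x, d, np.array([12.0]))[0] for d in candidate_doses]
best = candidate_doses[int(np.argmin(np.abs(np.array(troughs) - target)))]
print(f"MAP CL={cl_map:.2f} L/h, V={v_map:.1f} L, suggested dose={best} mg")
```

The maximum a posteriori estimate balances the fit to the measured concentration against the population prior, which is the basic trade-off the benchmarked programs automate.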

Relevance:

20.00%

Publisher:

Abstract:

Despite a decade-long downward trend in teen fatalities due to motor vehicle crashes, such crashes remain the leading cause of adolescent death in Iowa. The purpose of this study was to create detailed case studies of each fatal motor vehicle crash involving a driver under the age of 20 that occurred in Iowa in 2009, 2010, and 2011. Data for each crash were gathered from media sources, law enforcement agencies, and the Iowa Department of Transportation. The driving records of the teens, which included their licensure history, prior traffic citations, and prior crashes, were also acquired. In addition, data about the charges filed against a teen as a result of being involved in a fatal crash were obtained. A total of 126 crashes involving 131 teen drivers that resulted in 143 fatalities were analyzed. Many findings for fatal crashes involving teen drivers in Iowa are consistent with national trends, including the overrepresentation of male drivers, crash involvement that increases with age, crash involvement per vehicle miles traveled that decreases with age, and the prevalence of single-vehicle road departure crashes. Relative to national statistics, teen fatalities from crashes in Iowa are more likely to occur from midnight to 6am and from 9am to noon. Crash type varied by driver age and county population level. Teen drivers contributed to 74% of the fatal crashes; the teen driver's contribution was unknown for 11% of crashes. Speed was a factor in about 25% of the crashes for which a teen driver was at fault, as was alcohol or drug impairment. Only 20% of the rear-seat occupants of the teen drivers' vehicles wore seat belts, compared with 60% of front-seat occupants. Analysis of the teens' driving records prior to the fatal crash suggests that prior at-fault crashes and speeding violations are associated with contributing to the fatal crash.

Relevance:

20.00%

Publisher:

Abstract:

Radioactive soil-contamination mapping and risk assessment is a vital issue for decision makers. Traditional approaches for mapping the spatial concentration of radionuclides employ various regression-based models, which usually provide a single-value prediction realization accompanied (in some cases) by estimation error. Such approaches do not provide the capability for rigorous uncertainty quantification or probabilistic mapping. Machine learning is a recent and fast-developing approach based on learning patterns and information from data. Artificial neural networks for prediction mapping have been especially powerful in combination with spatial statistics. A data-driven approach provides the opportunity to integrate additional relevant information about spatial phenomena into a prediction model for more accurate spatial estimates and associated uncertainty. Machine-learning algorithms can also be used for a wider spectrum of problems than before: classification, probability density estimation, and so forth. Stochastic simulations are used to model spatial variability and uncertainty. Unlike regression models, they provide multiple realizations of a particular spatial pattern that allow uncertainty and risk quantification. This paper reviews the most recent methods of spatial data analysis, prediction, and risk mapping, based on machine learning and stochastic simulations in comparison with more traditional regression models. The radioactive fallout from the Chernobyl Nuclear Power Plant accident is used to illustrate the application of the models for prediction and classification problems. This fallout is a unique case study that provides the challenging task of analyzing huge amounts of data ('hard' direct measurements, as well as supplementary information and expert estimates) and solving particular decision-oriented problems.
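To make the idea of multiple prediction realizations concrete, the sketch below trains a small bootstrap ensemble of neural networks on synthetic spatial data and derives a mean map, an uncertainty map, and a threshold-exceedance probability map. The synthetic field, network sizes, and threshold are illustrative assumptions; a bootstrap ensemble is used here as a simple stand-in for the stochastic simulations discussed in the paper, not as the authors' actual method, and the data are not the Chernobyl measurements.

```python
# Illustrative sketch: an ensemble of small neural networks producing
# multiple prediction realizations and a probabilistic exceedance map.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic "measurements": coordinates (x, y) and a noisy spatial field
coords = rng.uniform(0, 10, size=(300, 2))
field = np.exp(-((coords[:, 0] - 4) ** 2 + (coords[:, 1] - 6) ** 2) / 8)
values = field + rng.normal(0, 0.05, size=300)

# Bootstrap ensemble: each member sees a resampled data set, giving
# multiple realizations of the prediction surface
ensemble = []
for seed in range(20):
    idx = rng.integers(0, len(coords), len(coords))
    net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=seed)
    net.fit(coords[idx], values[idx])
    ensemble.append(net)

# Predict on a regular grid and summarize the realizations
gx, gy = np.meshgrid(np.linspace(0, 10, 50), np.linspace(0, 10, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
realizations = np.stack([net.predict(grid) for net in ensemble])

mean_map = realizations.mean(axis=0)             # prediction map
std_map = realizations.std(axis=0)               # uncertainty map
prob_exceed = (realizations > 0.5).mean(axis=0)  # P(value > threshold), for risk mapping
print(prob_exceed.reshape(50, 50)[25, 15:25])
```

The exceedance probability, rather than a single predicted value, is the kind of decision-oriented output that single-realization regression models cannot provide.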

Relevance:

20.00%

Publisher:

Abstract:

The general public seems convinced that juvenile delinquency has increased massively over recent decades. However, this assumption is much less popular among academics and in some media, where doubts about the reality of this trend are often expressed. In the present paper, trends are followed using conviction statistics over 50 years, police and victimization data since the 1980s, and self-report data collected since 1992. All sources consistently point to a massive increase in offending among juveniles, particularly in violent offences during the 1990s. Given that trends were similar in most European countries, explanations should be sought at the European rather than the national level. The available evidence points to possible effects of increased opportunities for property offences since 1950 and, although causality remains hard to prove, of increased exposure to extreme media violence since 1985.

Relevance:

20.00%

Publisher:

Abstract:

The reliable and objective assessment of chronic disease state has been, and still is, a very significant challenge in clinical medicine. Physical activity during daily life is an essential feature of human behavior related to health status, functional capacity, and quality of life. A common way to assess physical activity is to measure the quantity of body movement. Since human activity is controlled by various factors both extrinsic and intrinsic to the body, quantitative parameters provide only a partial assessment and do not allow a clear distinction between normal and abnormal activity. In this paper, we propose a methodology for analyzing human activity patterns based on the definition of different physical activity time series and appropriate analysis methods. The temporal pattern of postures, movements, and transitions between postures was quantified using fractal analysis and symbolic dynamics statistics. The derived nonlinear metrics were able to discriminate patterns of daily activity generated in healthy and chronic pain states.
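As an illustration of a symbolic dynamics statistic of the kind mentioned above, the sketch below encodes a posture sequence as symbols, forms short "words", and computes the normalized Shannon entropy of their distribution. The posture alphabet, word length, and synthetic sequence are assumptions for demonstration, not the metrics or protocol used in the paper.

```python
# Minimal sketch of a symbolic-dynamics statistic for a posture time series.
from collections import Counter
import math
import random

# Hypothetical posture symbols: 'L' lying, 'S' sitting, 'T' standing, 'W' walking
random.seed(1)
postures = random.choices("LSTW", weights=[1, 4, 3, 2], k=2000)

def word_entropy(symbols, word_len=3):
    """Normalized Shannon entropy of the distribution of symbol 'words'."""
    words = ["".join(symbols[i:i + word_len]) for i in range(len(symbols) - word_len + 1)]
    counts = Counter(words)
    total = sum(counts.values())
    probs = [c / total for c in counts.values()]
    h = -sum(p * math.log2(p) for p in probs)
    h_max = word_len * math.log2(4)   # maximum entropy for a 4-symbol alphabet
    return h / h_max

print(f"Normalized word entropy: {word_entropy(postures):.3f}")
```

A more regular, repetitive activity pattern yields a lower normalized entropy than a varied one, which is the intuition behind using such nonlinear metrics to separate activity states.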

Relevance:

20.00%

Publisher:

Abstract:

The information in this digest comes from the FY11 Iowa Annual Public Library Survey. It reflects the activities of 525 of the 543 public libraries in Iowa.

Relevance:

20.00%

Publisher:

Abstract:

The 2010-2011 (FY11) edition of Iowa Public Library Statistics includes information on income, expenditures, collections, circulation, and other measures, including staff. Each section is arranged by size code, then alphabetically by city. The totals and percentiles for each size code grouping are given immediately following the alphabetical listings. Totals and medians for all reporting libraries are given at the end of each section. There are 543 libraries included in this publication; 525 submitted a report. The table of size codes (page 5) lists the libraries alphabetically. The following table lists the size code designations, the population range in each size code, the number of libraries reporting in each size code, and the total population of the reporting libraries in each size code. The total population served by the 543 libraries is 2,339,070. Population data is used to determine per capita figures throughout the publication.
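A minimal sketch of how per capita figures and size-code medians of the kind reported in this publication could be derived is shown below; the library names, size codes, and numbers are invented for illustration and are not FY11 data.

```python
# Hypothetical example of per capita figures and size-code medians.
from statistics import median

libraries = [
    # (city, size_code, population, circulation)
    ("Alpha", "A", 400, 6200),
    ("Bravo", "A", 650, 9100),
    ("Cedar", "D", 12000, 180000),
    ("Delta", "D", 15500, 240000),
]

# Per capita circulation for each reporting library
per_capita = {city: circ / pop for city, _, pop, circ in libraries}

# Medians grouped by size code, mirroring the size-code sections of the publication
by_code = {}
for city, code, pop, circ in libraries:
    by_code.setdefault(code, []).append(circ / pop)
medians = {code: median(vals) for code, vals in by_code.items()}

print(per_capita)
print(medians)
```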

Relevance:

20.00%

Publisher:

Abstract:

Accurate prediction of transcription factor binding sites is needed to unravel the function and regulation of genes discovered in genome sequencing projects. To evaluate current computer prediction tools, we have begun a systematic study of the sequence-specific DNA-binding of a transcription factor belonging to the CTF/NFI family. Using a systematic collection of rationally designed oligonucleotides combined with an in vitro DNA binding assay, we found that the sequence specificity of this protein cannot be represented by a simple consensus sequence or weight matrix. For instance, CTF/NFI uses a flexible DNA binding mode that allows for variations of the binding site length. From the experimental data, we derived a novel prediction method using a generalised profile as a binding site predictor. Experimental evaluation of the generalised profile indicated that it accurately predicts the binding affinity of the transcription factor to natural or synthetic DNA sequences. Furthermore, the in vitro measured binding affinities of a subset of oligonucleotides were found to correlate with their transcriptional activities in transfected cells. The combined computational-experimental approach exemplified in this work thus resulted in an accurate prediction method for CTF/NFI binding sites potentially functioning as regulatory regions in vivo.
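For context, the sketch below implements the baseline weight-matrix scoring that the study found insufficient for CTF/NFI: a fixed-width matrix cannot capture the variable binding-site length that the generalised profile handles. The example site sequences and pseudocount are hypothetical.

```python
# Minimal sketch of position weight matrix (PWM) scoring; sites are invented.
import math
from collections import Counter

known_sites = ["TTGGCTTTTGCCAA", "TTGGCATATGCCAA", "TTGGCTAAAGCCAA"]  # hypothetical
width = len(known_sites[0])
background = 0.25          # uniform background base frequency
pseudocount = 0.5

# Build a log-odds weight matrix from the aligned sites
pwm = []
for pos in range(width):
    counts = Counter(site[pos] for site in known_sites)
    col = {}
    for base in "ACGT":
        freq = (counts.get(base, 0) + pseudocount) / (len(known_sites) + 4 * pseudocount)
        col[base] = math.log2(freq / background)
    pwm.append(col)

def best_hit(sequence):
    """Slide the matrix along a sequence and return the best-scoring window."""
    best = max(range(len(sequence) - width + 1),
               key=lambda i: sum(pwm[j][sequence[i + j]] for j in range(width)))
    score = sum(pwm[j][sequence[best + j]] for j in range(width))
    return best, score

print(best_hit("ACGTTTGGCTTTTGCCAATCGA"))
```

Because every column of such a matrix is scored independently at a fixed offset, it cannot represent the length-variable binding mode described above, which motivates the generalised profile used in the study.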