907 results for General Linear Methods


Relevance:

30.00%

Publisher:

Abstract:

We showed earlier how to predict the writhe of any rational knot or link in its ideal geometric configuration, or equivalently the average of the 3D writhe over statistical ensembles of random configurations of a given knot or link (Cerf and Stasiak 2000 Proc. Natl Acad. Sci. USA 97 3795). There is no general relation between the minimal crossing number of a knot and the writhe of its ideal geometric configuration. However, within individual families of knots linear relations between minimal crossing number and writhe were observed (Katritch et al 1996 Nature 384 142). Here we present a method that allows us to express the writhe as a linear function of the minimal crossing number within Conway families of knots and links in their ideal configuration. The slope of the lines and the shift between any two lines with the same
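
As a hedged illustration, not taken from the paper itself, the kind of relation established within a Conway family F can be written in the following form, where Wr denotes the writhe of the ideal configuration, C the minimal crossing number, and a_F and b_F are illustrative symbols for the family-specific slope and intercept:

```latex
Wr(K) \;\approx\; a_F \, C(K) + b_F , \qquad K \in F .
```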

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVES: This study aimed to assess the validity of COOP charts in a general population sample, to examine whether illustrations contribute to instrument validity, and to establish general population norms. METHODS: A general population mail survey was conducted among residents aged 20-79 years of the Swiss canton of Vaud. Participants were invited to complete the COOP charts and the SF-36 Health Survey; they also provided data on health service use in the previous month. Two thirds of the respondents received standard COOP charts; the rest received charts without illustrations. RESULTS: Overall, 1250 persons responded (54%). The presence of illustrations did not affect score distributions, except that the illustrated 'physical fitness' chart drew greater non-response (10% vs. 3%, p < 0.001). Validity tests were similar for illustrated and picture-less charts. Factor analysis yielded two principal components, corresponding to physical and mental health. Six COOP charts showed strong and nearly linear relationships with corresponding SF-36 scores (all p < 0.001), demonstrating concurrent validity. Similarly, most COOP charts were associated with the use of medical services in the past month. Only the chart on 'social support' partly deviated from construct validity hypotheses. Population norms revealed a generally lower health status in women and an age-related decline in physical health. CONCLUSIONS: COOP charts can be used to assess the health status of a general population. Their validity is good, with the possible exception of the 'social support' chart. The illustrations do not affect the properties of this instrument.

Relevance:

30.00%

Publisher:

Abstract:

AIM: To determine the extent to which drinking patterns (at the individual and the country level) are associated with alcohol-related consequences over and above the total amount of alcohol the person consumes. METHODS: Hierarchical linear models were estimated based on general population surveys conducted in 18 countries participating in the GENACIS project. RESULTS: In general, a positive association between drinking pattern scores and alcohol-related consequences was found at both the individual and the country level, independent of the volume of drinking. In addition, a significant interaction effect indicated that the more detrimental the country's drinking pattern, the less steep the association between the volume of drinking and its consequences. CONCLUSION: Drinking patterns have an independent impact on consequences over and above the relationship between volume and consequences.
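
A minimal sketch of the kind of two-level analysis described above, assuming pandas and statsmodels are available; the file name and the column names (consequences, volume, pattern_indiv, pattern_country, country) are illustrative placeholders, not the GENACIS variables:

```python
# Hierarchical (mixed) linear model: individuals nested within countries,
# with a cross-level interaction between individual drinking volume and the
# country-level drinking-pattern score. All names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("genacis_like_survey.csv")  # hypothetical survey extract

model = smf.mixedlm(
    "consequences ~ volume + pattern_indiv + pattern_country + volume:pattern_country",
    data=df,
    groups=df["country"],   # random intercept per country
    re_formula="~volume",   # random slope of volume across countries
)
result = model.fit()
print(result.summary())     # a negative volume:pattern_country term would mirror the reported interaction
```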

Relevance:

30.00%

Publisher:

Abstract:

Introduction: The writing of prescriptions is an important aspect of medical practice. In 2006, the Swiss authorities decided to introduce incentives to prescribe generic drugs. The objectives of this study were 1) to determine the evolution of outpatient prescription practice in our paediatric university hospital during two periods separated by 5 years, and 2) to assess the writing quality of outpatient prescriptions during the same period. Materials & Methods: Design: Copies of prescriptions written by physicians were collected twice from community pharmacies in the region of our hospital, over a 2-month period in 2005 and in 2010. They were analysed according to standard criteria regarding both formal and pharmaceutical aspects. Drug prescriptions were classified as a) complete, when all criteria for safety were fulfilled; b) ambiguous, when there was a danger of a dispensing error because of one or more missing criteria; or c) containing an error. Setting: Paediatric university hospital. Main outcome measures: Proportion of generic drugs; outpatient prescription writing quality. Results: A total of 651 handwritten prescriptions were reviewed in 2005 and 693 in 2010. They contained 1570 drug prescriptions in 2005 (2.4 ± 1.2 drugs per patient) and 1462 in 2010 (2.1 ± 1.1). The most common drugs were paracetamol, ibuprofen and sodium chloride. A higher proportion of drugs was prescribed by generic name or as generics in 2010. Formal data regarding the physicians and the patients were almost complete, except for the patients' weight. Of the drug prescriptions, 48.5% were incomplete, 11.3% were ambiguous and 3.0% contained an error in 2005. These proportions rose to 64.2%, 15.5% and 7.4%, respectively, in 2010. Discussion, Conclusion: This study showed that physicians' prescriptions contained numerous omissions and errors with minimal potential for harm. Computerized prescribing coupled with advanced decision support is eagerly awaited. Disclosure of Interest: None declared.

Relevance:

30.00%

Publisher:

Abstract:

This article describes the application of a recently developed general unknown screening (GUS) strategy based on LC coupled to a hybrid linear IT-triple quadrupole mass spectrometer (LC-MS/MS-LIT) for the simultaneous detection and identification of drug metabolites following in vitro incubation with human liver microsomes. The histamine H1 receptor antagonist loratadine was chosen as a model compound to demonstrate the value of such an approach, because of its previously described complex and extensive metabolism. Detection and mass spectral characterization were based on data-dependent acquisition, switching between a survey scan acquired in the ion-trapping Q3 scan mode with dynamic subtraction of background noise, and a dependent scan in the ion-trapping product ion scan mode of automatically selected parent ions. In addition, the MS(3) mode was used in a second step to confirm the structure of a few fragment ions. The sensitivity of the ion-trapping modes, combined with the selectivity of the triple quadrupole modes, allowed the detection and identification of 17 phase I metabolites of loratadine with only one injection. The GUS procedure used in this study may be applicable as a generic technique for the characterization of drug metabolites after in vitro incubation, and probably also in in vivo experiments.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: To determine whether resin-dentin microtensile bond strength (µTBS) results are correlated with the outcome parameters of clinical studies on non-retentive Class V restorations. METHODS: Resin-dentin µTBS data were obtained from one test center; the in vitro tests were all performed by the same operator. The µTBS testing was performed 8 h after bonding and after 6 months of storing the specimens in water. Pre-test failures (PTFs) of specimens were included in the analysis, attributing them a value of 1 MPa. Prospective clinical studies on cervical (Class V) restorations with an observation period of at least 18 months were searched in the literature. The clinical outcome variables were retention loss, marginal discoloration and marginal integrity. Furthermore, an index was formulated to allow a better comparison of the laboratory and clinical results. Estimates of adhesive effects in a linear mixed model were used to summarize the clinical performance of each adhesive between 12 and 36 months. Spearman correlations between these clinical performances and the µTBS values were subsequently calculated. RESULTS: Thirty-six clinical studies with 15 adhesive/restorative systems for which µTBS data were also available were included in the statistical analysis. In general, 3-step and 2-step etch-and-rinse systems showed higher bond strength values than the 2-step/3-step self-etching systems, which in turn produced higher values than the 1-step self-etching and the resin-modified glass ionomer systems. Prolonged water storage of specimens resulted in a significant decrease of the mean bond strength values for 5 adhesive systems (Wilcoxon, p<0.05). There was a significant correlation between µTBS values, both after 8 h and after 6 months of storage, and marginal discoloration (r=0.54 and r=0.67, respectively). However, no such correlation was found between µTBS values and the retention rate, clinical index or marginal integrity. SIGNIFICANCE: As µTBS data of adhesive systems, especially after water storage for 6 months, showed a good correlation with marginal discoloration in short-term clinical Class V restorations, longitudinal clinical trials should explore whether early marginal staining is predictive of future retention loss in non-carious cervical restorations.
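
A minimal sketch of the correlation step described above (Spearman rank correlation between per-adhesive µTBS values and a clinical outcome estimate), assuming NumPy and SciPy are available; the numbers are illustrative placeholders, not the study's data:

```python
import numpy as np
from scipy.stats import spearmanr

# One value per adhesive system (hypothetical numbers).
utbs_6_months = np.array([38.2, 25.4, 41.0, 18.7, 30.1, 22.5])           # MPa after 6 months of water storage
marginal_discoloration = np.array([0.12, 0.35, 0.08, 0.48, 0.22, 0.40])  # clinical estimate per adhesive

rho, p_value = spearmanr(utbs_6_months, marginal_discoloration)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```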

Relevance:

30.00%

Publisher:

Abstract:

INTRODUCTION: Delirium is a highly prevalent disorder with serious consequences for the hospitalised patient. Nevertheless, it remains under-diagnosed and under-treated. We developed evidence-based clinical practice guidelines (CPGs) focusing on prevention, screening, diagnosis and treatment of delirium in a general hospital. This article presents the implementation process of these CPGs and a before-after study assessing their impact on healthcare professionals' knowledge and on clinical practice. METHODS: The CPGs on delirium were first implemented in two wards (Neurology and Neurosurgery) of the Lausanne university hospital. Interactive one-hour educational sessions for small groups of nurses and physicians were organised. Participants received a summary of the guidelines and completed a multiple-choice questionnaire, assessing putative changes in knowledge, before and three months after the educational session. Other indicators, such as a "diagnosis of delirium" reported in the discharge letters and the mean duration of patients' hospital stay, were compared before and after implementation. RESULTS: Eighty percent of the nurses and physicians from the Neurology and Neurosurgery wards attended the educational sessions. Both nurses and physicians significantly improved their knowledge after the implementation (+9 percentage points). The other indicators were not modified by the intervention. CONCLUSION: A single interactive intervention improved both nurses' and physicians' knowledge of delirium. Sustained and repeated interventions are probably needed to demonstrate changes in clinical practice.

Relevance:

30.00%

Publisher:

Abstract:

Automatic environmental monitoring networks supported by wireless communication technologies now provide large and ever-increasing volumes of data. The use of this information in natural hazard research is an important issue. Spatial maps of hazard-related parameters, produced from point observations and available auxiliary information, are particularly useful for risk assessment and decision making. The purpose of this article is to present and explore appropriate tools to process large amounts of available data and produce predictions at fine spatial scales. These are the algorithms of machine learning, which are aimed at non-parametric, robust modelling of non-linear dependencies from empirical data. The computational efficiency of the data-driven methods allows prediction maps to be produced in real time, which makes them superior to physical models for operational use in risk assessment and mitigation. This situation arises in particular in the spatial prediction of climatic variables (topo-climatic mapping). In the complex topography of mountainous regions, meteorological processes are strongly influenced by the relief. The article shows how these relations, possibly regionalized and non-linear, can be modelled from data using information from digital elevation models. The particular illustration of the developed methodology concerns the mapping of temperatures (including Föhn and temperature-inversion situations) given the measurements taken from the Swiss meteorological monitoring network. The range of methods used in the study includes data-driven feature selection, support vector algorithms and artificial neural networks.
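
A minimal sketch of the data-driven workflow described above (learning temperature from station measurements plus DEM-derived terrain features and predicting on a fine grid), assuming scikit-learn and pandas are available; the file names and feature columns are illustrative assumptions, not the Swiss monitoring data:

```python
import pandas as pd
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Hypothetical station table: coordinates, DEM-derived terrain features and measured temperature.
stations = pd.read_csv("station_measurements.csv")
features = ["x", "y", "elevation", "slope", "curvature"]

# Support vector regression on standardized features, one of the machine-learning tools mentioned.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(stations[features], stations["temperature"])

# Hypothetical fine-resolution prediction grid carrying the same DEM-derived features.
grid = pd.read_csv("dem_prediction_grid.csv")
grid["temperature_pred"] = model.predict(grid[features])
```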

Relevance:

30.00%

Publisher:

Abstract:

The objectives of this work were to estimate the genetic and phenotypic parameters and to predict the genetic and genotypic values of the selection candidates obtained from intraspecific crosses in Panicum maximum, as well as the performance of the hybrid progeny of existing and projected crosses. Seventy-nine intraspecific hybrids obtained from artificial crosses among five apomictic and three sexual autotetraploid individuals were evaluated in a clonal test with two replications and ten plants per plot. Green matter yield, total and leaf dry matter yields and leaf percentage were evaluated in five cuts per year over three years. Genetic parameters were estimated, and breeding and genotypic values were predicted, using the restricted maximum likelihood/best linear unbiased prediction procedure (REML/BLUP). The dominance genetic variance was estimated by adjusting for the effect of full-sib families. Individual narrow-sense heritabilities (0.02-0.05), individual broad-sense heritabilities (0.14-0.20) and repeatabilities measured on an individual basis (0.15-0.21) were of low magnitude. Dominance effects for all evaluated characteristics indicated that breeding strategies that exploit heterosis should be adopted. An increase of less than 5% in repeatability was obtained for a three-year evaluation period; this may serve as the criterion to determine the maximum number of years of evaluation to be adopted without compromising the gain per selection cycle. The identification of hybrid candidates for future cultivars, and of those that can be incorporated into the breeding program, was based on the genotypic and breeding values, respectively. The prediction of the performance of the hybrid progeny, based on the breeding values of the progenitors, permitted the identification of the best crosses and indicated the best parents to use in crosses.
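
For reference, a hedged sketch of the standard mixed-model formulation underlying REML/BLUP prediction of breeding values (textbook notation, not copied from the paper): phenotypes are modelled as y = Xb + Zu + e, and Henderson's mixed model equations give the estimates of the fixed effects together with the BLUP of the additive (breeding) values, with A the additive relationship matrix and lambda the residual-to-additive variance ratio:

```latex
\begin{bmatrix}
\mathbf{X}'\mathbf{X} & \mathbf{X}'\mathbf{Z} \\
\mathbf{Z}'\mathbf{X} & \mathbf{Z}'\mathbf{Z} + \mathbf{A}^{-1}\lambda
\end{bmatrix}
\begin{bmatrix}
\hat{\boldsymbol{\beta}} \\ \hat{\mathbf{u}}
\end{bmatrix}
=
\begin{bmatrix}
\mathbf{X}'\mathbf{y} \\ \mathbf{Z}'\mathbf{y}
\end{bmatrix},
\qquad
\lambda = \frac{\sigma^2_e}{\sigma^2_u}.
```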

Relevance:

30.00%

Publisher:

Abstract:

Polynomial constraint solving plays a prominent role in several areas of hardware and software analysis and verification, e.g., termination proving, program invariant generation and hybrid system verification, to name a few. In this paper we propose a new method for solving non-linear constraints based on encoding the problem into an SMT problem over linear arithmetic only. Unlike other existing methods, our method focuses on proving satisfiability of the constraints rather than on proving unsatisfiability, which is more relevant in many applications, as we illustrate with several examples. Nevertheless, we also present new techniques, based on the analysis of unsatisfiable cores, that allow one to efficiently prove unsatisfiability as well for a broad class of problems. The power of our approach is demonstrated by means of extensive experiments comparing our prototype with state-of-the-art tools on benchmarks taken from both the academic and the industrial world.
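
A minimal sketch of the general idea of reducing a non-linear satisfiability query to purely linear SMT queries by case-splitting one variable over a finite domain, assuming the z3 Python bindings are available; the constraint, the domain bounds and the function name are illustrative, and this is not the paper's actual encoding:

```python
from z3 import Int, Solver, sat

def find_model(domain=range(-5, 6)):
    """Search for a model of x*y + z >= 3, y + z <= 4, z >= 0 (toy constraint)."""
    y, z = Int("y"), Int("z")
    for x_val in domain:          # case-split: fix x to each value of a finite domain
        s = Solver()
        # With x fixed to a constant, x*y becomes the linear term x_val*y,
        # so each individual query uses only linear integer arithmetic.
        s.add(x_val * y + z >= 3, y + z <= 4, z >= 0)
        if s.check() == sat:
            m = s.model()
            return {"x": x_val, "y": m[y], "z": m[z]}
    return None  # unsat over the chosen finite domain (not a proof of unsatisfiability)

print(find_model())
```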

Relevance:

30.00%

Publisher:

Abstract:

Blowing and drifting of snow is a major concern for transportation efficiency and road safety in regions where such events are common. One common way to mitigate snow drift on roadways is to install plastic snow fences. Correct design of snow fences is critical for road safety, for keeping roads open during winter in the US Midwest and other states affected by large snow events, and for keeping the costs related to snow accumulation on the roads and road repair to a minimum. Of critical importance for road safety is protection against snow drifting in regions with narrow rights of way, where standard fences cannot be deployed at the recommended distance from the road. Designing snow fences requires sound engineering judgment and a thorough evaluation of the potential for snow blowing and drifting at the construction site. The evaluation includes site-specific design parameters typically obtained with semi-empirical relations characterizing the local transport conditions. Among the critical parameters involved in fence design and in assessing post-construction efficiency is the quantification of snow accumulation at fence sites. The present study proposes a joint experimental and numerical approach to monitor snow deposits around snow fences, quantitatively estimate snow deposits in the field, assess the efficiency of snow fences and improve their design. Snow deposit profiles were mapped using GPS-based real-time kinematic (RTK) surveys conducted at the monitored field site during and after snow storms. The monitored site allowed testing of different snow fence designs under close to identical conditions over four winter seasons. The study also discusses the detailed monitoring system and the analysis of weather forecasts and meteorological conditions at the monitored sites. A main goal of the present study was to assess the performance of lightweight plastic snow fences with a lower porosity than the typical 50% porosity used in standard designs of such fences. The field data collected during the first winter were used to identify the best design for snow fences with a porosity of 50%. Flow fields obtained from numerical simulations showed that the fence design that worked best during the first winter induced the formation of an elongated area of small velocity magnitude close to the ground. This information was used to identify other candidates for the optimum design of fences with a lower porosity. Two of the designs with a fence porosity of 30% that were found to perform well based on the results of the numerical simulations were tested in the field during the second winter, along with the best-performing design for fences with a porosity of 50%. Field data showed that the length of the snow deposit away from the fence was reduced by about 30% for the two proposed lower-porosity (30%) fence designs compared to the best design identified for fences with a porosity of 50%. Moreover, one of the lower-porosity designs tested in the field showed no significant snow deposition within the bottom gap region beneath the fence. Thus, a major outcome of this study is the recommendation to use plastic snow fences with a porosity of 30%. It is expected that this lower-porosity design will continue to work well for even more severe snow events or for successive snow events occurring during the same winter. The approach advocated in the present study allowed general recommendations to be made for optimizing the design of lower-porosity plastic snow fences.
This approach can be extended to improve the design of other types of snow fences. Some preliminary work on living snow fences is also discussed. Another major contribution of this study is to propose, develop protocols for and test a novel technique based on close-range photogrammetry (CRP) to quantify the snow deposits trapped by snow fences. As image data can be acquired continuously, the time evolution of the volume of snow retained by a snow fence during a storm or during a whole winter season can, in principle, be obtained. Moreover, CRP is a non-intrusive method that eliminates the need to perform manual measurements during storms, which are difficult and sometimes dangerous to perform. Presently, there is a great deal of empiricism in the design of snow fences, due to the lack of data on fence storage capacity and on how snow deposits change with fence design and snowstorm characteristics, and in the estimation of the main parameters used by state DOTs to design snow fences at a given site. The availability of such information from CRP measurements should provide critical data for evaluating the performance of a given snow fence design tested by the IDOT. As part of the present study, the novel CRP method was tested at several sites. The present study also discusses some attempts and preliminary work to determine the snow relocation coefficient, one of the main variables that has to be estimated by IDOT engineers when using the standard snow fence design software (Snow Drift Profiler, Tabler, 2006). Our analysis showed that standard empirical formulas did not produce reasonable values when applied at the Iowa test sites monitored as part of the present study, and that simple methods to estimate this variable are not reliable. The present study makes recommendations for the development of a new methodology based on Large Scale Particle Image Velocimetry that can directly measure the snow drift fluxes and the amount of snow relocated by the fence.

Relevance:

30.00%

Publisher:

Abstract:

The work presented here is part of a larger study to identify novel technologies and biomarkers for early Alzheimer disease (AD) detection, and it focuses on evaluating the suitability of a new approach for early AD diagnosis by non-invasive methods. The purpose is to examine, in a pilot study, the potential of applying intelligent algorithms to speech features obtained from suspected patients in order to contribute to improving the diagnosis of AD and its degree of severity. In this sense, Artificial Neural Networks (ANN) have been used for the automatic classification of the two classes (AD and control subjects). Two aspects of human speech have been analyzed for feature selection: Spontaneous Speech and Emotional Response. Not only linear features but also non-linear ones, such as Fractal Dimension, have been explored. The approach is non-invasive, low-cost and free of side effects. The experimental results obtained were very satisfactory and promising for early diagnosis and classification of AD patients.
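
A minimal sketch of the classification step described above (a small feed-forward ANN separating AD patients from controls on the basis of speech-derived features), assuming scikit-learn is available; the feature matrix and labels are random placeholders standing in for the real Spontaneous Speech and Emotional Response features:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 12))      # hypothetical speech features (e.g. pause statistics, fractal dimension)
y = rng.integers(0, 2, size=60)    # 1 = AD, 0 = control (placeholder labels)

clf = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)
scores = cross_val_score(clf, X, y, cv=5)
print("Cross-validated accuracy:", scores.mean())
```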

Relevance:

30.00%

Publisher:

Abstract:

The relationship between source separation and blind deconvolution is well known: if a filtered version of an unknown i.i.d. signal is observed, temporal independence between samples can be used to retrieve the original signal, in the same way that spatial independence is used for source separation. In this paper we propose the use of a Genetic Algorithm (GA) to blindly invert linear channels. The use of a GA is justified in the case of a small number of samples, where gradient-like methods fail because of the poor estimation of the statistics.
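
A minimal sketch of the idea, assuming NumPy is available: a simplified genetic algorithm (truncation selection plus Gaussian mutation, without crossover) searches for the coefficients of an inverse FIR filter, using the deviation of the output's kurtosis from the Gaussian value as a stand-in independence criterion; the channel, fitness measure and parameter values are illustrative, not those of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def kurtosis(x):
    x = x - x.mean()
    return np.mean(x**4) / (np.mean(x**2) ** 2 + 1e-12)

def fitness(w, observed):
    # Deconvolved output; a strongly non-Gaussian output suggests the channel has been inverted.
    y = np.convolve(observed, w, mode="same")
    return abs(kurtosis(y) - 3.0)

def ga_deconvolve(observed, filt_len=5, pop=40, gens=100):
    population = rng.normal(size=(pop, filt_len))
    for _ in range(gens):
        scores = np.array([fitness(w, observed) for w in population])
        parents = population[np.argsort(-scores)[: pop // 2]]       # keep the fittest half
        children = parents + 0.1 * rng.normal(size=parents.shape)   # Gaussian mutation
        population = np.vstack([parents, children])
    scores = np.array([fitness(w, observed) for w in population])
    return population[np.argmax(scores)]

# Toy usage: an i.i.d. binary source filtered by an unknown channel.
source = rng.choice([-1.0, 1.0], size=2000)
channel = np.array([1.0, 0.6, -0.3])
observed = np.convolve(source, channel, mode="same")
w_hat = ga_deconvolve(observed)
```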

Relevance:

30.00%

Publisher:

Abstract:

This paper develops an approach to rank testing that nests all existing rank tests and simplifies their asymptotics. The approach is based on the fact that implicit in every rank test there are estimators of the null spaces of the matrix in question. The approach yields many new insights about the behavior of rank testing statistics under the null as well as local and global alternatives, in both the standard and the cointegration setting. The approach also suggests many new rank tests based on alternative estimates of the null spaces, as well as the new fixed-b theory. A brief Monte Carlo study illustrates the results.
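
As a hedged reminder of the generic setting (standard notation, not necessarily the paper's), a rank test assesses the rank of an unknown matrix Pi, estimated from the data, through hypotheses of the form below; the left and right null spaces of Pi are the objects whose implicit estimators the paper exploits:

```latex
H_0 : \operatorname{rank}(\Pi) = r
\qquad \text{against} \qquad
H_1 : \operatorname{rank}(\Pi) > r .
```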