859 results for WORK METHODS


Relevance: 30.00%

Abstract:

OBJECTIVE: Ability to work and live independently is of particular concern for patients with Parkinson's disease (PD). We studied a series of PD patients able to work or live independently at baseline, and evaluated potential risk factors for two separate outcomes: loss of ability to work and loss of ability to live independently. METHODS: The series comprised 495 PD patients followed prospectively. Ability to work and ability to live independently were assessed by clinical interview and examination. Cox regression models adjusted for age and disease duration were used to evaluate associations of baseline characteristics with loss of ability to work and loss of ability to live independently. RESULTS: Higher UPDRS dyskinesia score, UPDRS instability score, UPDRS total score, Hoehn and Yahr stage, and presence of intellectual impairment at baseline were all associated with increased risk of future loss of ability to work and loss of ability to live independently (P ≤ 0.0033). Five years after the initial visit, 88% of patients ≤70 years of age with a disease duration ≤4 years at the initial visit were still able to work and 90% were still able to live independently. These estimates worsened as age and disease duration at the initial visit increased; for patients >70 years of age with a disease duration >4 years, the 5-year estimates were 43% able to work and 57% able to live independently. CONCLUSIONS: This study offers useful information for PD patients preparing for future changes in their ability to perform activities of daily living.
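The risk factor analysis above rests on Cox proportional hazards regression. As an illustration only (the study's actual models, covariates and data are not reproduced here), a minimal one-covariate Cox fit via Newton's method on the Breslow partial log-likelihood can be sketched in a few lines; the synthetic cohort and its hazard ratio are invented for the demonstration.

```python
import numpy as np

def cox_fit(time, event, x, iters=25):
    """Fit a one-covariate Cox model by Newton's method on the
    Breslow partial log-likelihood; returns the log hazard ratio."""
    order = np.argsort(time)
    time, event, x = time[order], event[order], x[order]
    beta = 0.0
    for _ in range(iters):
        grad, hess = 0.0, 0.0
        for i in np.flatnonzero(event):
            at_risk = x[i:]                      # risk set: subjects still under observation
            w = np.exp(beta * at_risk)
            xbar = np.sum(w * at_risk) / np.sum(w)
            grad += x[i] - xbar                  # score contribution of this event
            hess += np.sum(w * at_risk**2) / np.sum(w) - xbar**2
        beta += grad / hess                      # Newton ascent step
    return beta

# Synthetic cohort: exposed subjects (x=1) fail twice as fast (true HR = 2).
rng = np.random.default_rng(0)
n = 400
x = rng.integers(0, 2, n).astype(float)
t = rng.exponential(1.0 / np.exp(np.log(2.0) * x))
event = np.ones(n)                               # no censoring in this sketch
beta_hat = cox_fit(t, event, x)
print(np.exp(beta_hat))                          # estimated hazard ratio, near 2
```

With censoring, ties, and several adjusted covariates the bookkeeping grows, but the estimating principle is the same.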

Relevance: 30.00%

Abstract:

"Most quantitative empirical analyses are motivated by the desire to estimate the causal effect of an independent variable on a dependent variable. Although the randomized experiment is the most powerful design for this task, in most social science research done outside of psychology, experimental designs are infeasible" (Winship & Morgan, 1999, p. 659). This quote from earlier work by Winship and Morgan, which was instrumental in laying the groundwork for their book, captures the essence of our review of Morgan and Winship's book: it is about causality in nonexperimental settings.

Relevance: 30.00%

Abstract:

This work presents new, efficient Markov chain Monte Carlo (MCMC) simulation methods for statistical analysis in various modelling applications. When using MCMC methods, the model is simulated repeatedly to explore the probability distribution describing the uncertainties in model parameters and predictions. In adaptive MCMC methods based on the Metropolis-Hastings algorithm, the proposal distribution needed by the algorithm learns from the target distribution as the simulation proceeds. Adaptive MCMC methods have been the subject of intensive research lately, as they open the way to substantially easier use of the methodology; the lack of user-friendly computer programs has been a major obstacle to wider acceptance of the methods. This work provides two new adaptive MCMC methods: DRAM and AARJ. The DRAM method has been built especially to work in high-dimensional and non-linear problems. The AARJ method is an extension of DRAM to model selection problems, where the mathematical formulation of the model is uncertain and we want to fit several different models to the same observations simultaneously. The methods were developed with the needs of modelling applications typical of environmental sciences in mind, and the development work was pursued in the course of several application projects. The applications presented in this work are: a wintertime oxygen concentration model for Lake Tuusulanjärvi and adaptive control of the aerator; a nutrition model for Lake Pyhäjärvi and lake management planning; and validation of the algorithms of the GOMOS ozone remote sensing instrument on board the European Space Agency's Envisat satellite, together with a study of the effects of aerosol model selection on the GOMOS algorithm.
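The adaptation idea behind methods of the DRAM family can be conveyed with a stripped-down adaptive Metropolis sampler in which the random-walk proposal covariance is periodically re-estimated from the chain's own history (Haario-style adaptation). This is a sketch of that idea only, not the DRAM algorithm itself: delayed rejection, the reversible-jump machinery of AARJ, and all practical safeguards are omitted, and the correlated 2-D Gaussian target is an invented stand-in for a real posterior.

```python
import numpy as np

def adaptive_metropolis(logpost, x0, n=20000, adapt_start=500):
    """Random-walk Metropolis whose proposal covariance is re-estimated
    from the chain history; a sketch of the 'AM' half of DRAM only."""
    d = len(x0)
    sd = 2.4**2 / d                              # scaling rule of Haario et al.
    rng = np.random.default_rng(1)
    chain = np.empty((n, d))
    x = np.array(x0, float)
    lp = logpost(x)
    cov = np.eye(d)
    for i in range(n):
        if i >= adapt_start and i % 100 == 0:    # periodically adapt the proposal
            cov = np.cov(chain[:i].T) + 1e-8 * np.eye(d)
        prop = rng.multivariate_normal(x, sd * cov)
        lp_prop = logpost(prop)
        if np.log(rng.random()) < lp_prop - lp:  # Metropolis accept/reject
            x, lp = prop, lp_prop
        chain[i] = x
    return chain

# Target: correlated 2-D Gaussian, an invented stand-in for a posterior.
target_cov = np.array([[1.0, 0.9], [0.9, 1.0]])
prec = np.linalg.inv(target_cov)
logpost = lambda z: -0.5 * z @ prec @ z
chain = adaptive_metropolis(logpost, [5.0, -5.0])
print(chain[5000:].mean(axis=0))                 # near [0, 0] after burn-in
```

Learning the proposal from the chain is what lets such samplers run with little hand-tuning, which is the usability point the abstract makes.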

Relevance: 30.00%

Abstract:

Euclidean distance matrix analysis (EDMA) methods are used to determine whether a significant difference exists between conformational samples of antibody complementarity determining region (CDR) loops: the isolated L1 loop versus L1 in a three-loop assembly (L1, L3 and H3), both obtained from Monte Carlo simulation. When a significant difference is detected, the specific inter-Cα distance contributing to the difference is identified using EDMA. The estimated and improved mean forms of the conformational samples of the isolated L1 loop and of L1 in the three-loop assembly, CDR loops of the antibody binding site, are described using EDMA and distance geometry (DGEOM). To the best of our knowledge, this is the first time EDMA methods have been used to analyze conformational samples of molecules obtained from Monte Carlo simulations. Therefore, validations of the EDMA methods using both positive-control and negative-control tests on the conformational samples of the isolated L1 loop and of L1 in the three-loop assembly must be done. The EDMA-I bootstrap null hypothesis tests showed false positive results for the comparison of six samples of the isolated L1 loop, and true positive results for the comparison of conformational samples of the isolated L1 loop and L1 in the three-loop assembly. The bootstrap confidence interval tests revealed true negative results for comparisons of six samples of the isolated L1 loop, and false negative results for the conformational comparisons between the isolated L1 loop and L1 in the three-loop assembly. Different conformational sample sizes were further explored, by combining the samples of the isolated L1 loop to increase the sample size, and by clustering the samples using a self-organizing map (SOM) to narrow the conformational distribution of the samples being compared. However, neither change improved the bootstrap null hypothesis or confidence interval tests.
These results show that more work is required before EDMA methods can be used reliably for the comparison of samples obtained by Monte Carlo simulations.
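For readers unfamiliar with EDMA, the basic object is the matrix of all inter-point Euclidean distances, and samples are compared through their mean form matrices. The following sketch, with invented toy coordinates, takes the ratio of two mean distance matrices so that entries far from 1 flag the distances driving a difference; the bootstrap hypothesis and confidence interval tests discussed above are not reproduced.

```python
import numpy as np

def edm(coords):
    """Euclidean distance matrix of one conformation (n_points x 3)."""
    diff = coords[:, None, :] - coords[None, :, :]
    return np.sqrt((diff**2).sum(-1))

def form_ratio(sample_a, sample_b):
    """EDMA-style comparison: elementwise ratio of the mean distance
    matrices of two conformational samples (diagonal is undefined)."""
    mean_a = np.mean([edm(c) for c in sample_a], axis=0)
    mean_b = np.mean([edm(c) for c in sample_b], axis=0)
    with np.errstate(invalid="ignore", divide="ignore"):
        return mean_a / mean_b

# Two toy "samples": B is A with point 0 displaced, plus small jitter.
rng = np.random.default_rng(2)
base = rng.normal(size=(5, 3))
sample_a = [base + 0.01 * rng.normal(size=base.shape) for _ in range(50)]
shifted = base.copy()
shifted[0] += 2.0
sample_b = [shifted + 0.01 * rng.normal(size=base.shape) for _ in range(50)]
ratio = form_ratio(sample_a, sample_b)
print(ratio.shape)   # (5, 5); row/column 0 carries the displacement
```

Entries between undisplaced points stay near 1, which is how EDMA localises a detected difference to specific inter-point distances.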

Relevance: 30.00%

Abstract:

New density functionals representing the exchange and correlation energies (per electron) are employed, based on the electron gas model, to calculate interaction potentials of noble gas systems X2 and XY, where X (and Y) are He, Ne, Ar and Kr, and of hydrogen atom-rare gas systems H-X. The exchange energy density functional is that recommended by Handler, and the correlation energy density functional is a rational function involving two parameters which were optimized to reproduce the correlation energy of the He atom. Application of the two-parameter function to the other rare gas atoms shows that it is "universal", i.e., accurate for the systems considered. The potentials obtained in this work compare well with recent experimental results and are a significant improvement over those from competing statistical models.

Relevance: 30.00%

Abstract:

Methods of measuring the specific heats of small samples were studied. Three automated methods were explored, two of which have shown promising results. The adiabatic continuous heating method has provided smooth, well-behaved data, but further work is presently underway to improve on the results obtained so far. The decay method has been successfully implemented, demonstrating reasonable agreement with accepted data for a copper test sample.
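The decay method rests on a simple relation: if a sample coupled to a bath through thermal conductance K relaxes exponentially with time constant tau after a heat pulse, its heat capacity is C = K·tau. A minimal sketch of the fitting step, with synthetic noise-free data (all numbers invented):

```python
import numpy as np

def heat_capacity_from_decay(t, temp, temp_bath, conductance):
    """Decay method sketch: T(t) = T_bath + dT * exp(-t / tau) with
    tau = C / K, so fit log(T - T_bath) linearly in t and take C = K * tau."""
    y = np.log(temp - temp_bath)
    slope, _ = np.polyfit(t, y, 1)   # slope = -1 / tau
    tau = -1.0 / slope
    return conductance * tau

# Synthetic decay: C = 2.0 J/K and K = 0.5 W/K give tau = 4 s.
K, C_true = 0.5, 2.0
t = np.linspace(0, 10, 200)
temp = 300.0 + 1.5 * np.exp(-t / (C_true / K))
C_est = heat_capacity_from_decay(t, temp, 300.0, K)
print(round(C_est, 3))   # → 2.0
```

With real data the bath temperature drift and measurement noise make the fit less clean, which is where the automation discussed above earns its keep.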

Relevance: 30.00%

Abstract:

This work investigates mathematical details and computational aspects of Metropolis-Hastings reptation quantum Monte Carlo and its variants, in addition to the Bounce method and its variants. The issues that concern us include the sensitivity of these algorithms' target densities to the position of the trial electron density along the reptile, the time-reversal symmetry of the propagators, and the length of the reptile. We calculate the ground-state energy and one-electron properties of LiH at its equilibrium geometry for all these algorithms. The importance sampling is performed with a single-determinant large Slater-type orbital (STO) basis set. The computer codes were written to exploit the efficiencies engineered into modern, high-performance computing software. Using the Bounce method to calculate non-energy-related properties, those represented by operators that do not commute with the Hamiltonian, is novel. We found that the unmodified Bounce method gives a good ground-state energy and very good one-electron properties. We attribute this to the favourable time-reversal symmetry of its target density's Green's functions; breaking this symmetry gives poorer results. Use of a short reptile in the Bounce method does not alter the quality of the results, suggesting that in future applications one can use a shorter reptile to cut down the computational time dramatically.

Relevance: 30.00%

Abstract:

The main objective of this thesis is to explore and analyze the reception of Eugen Wüster's work in order to explain how it influenced the disciplinary development of terminology. Historically, Wüster's work, in particular the General Theory of Terminology, stimulated research in terminology. Despite diverging opinions, there is agreement that Wüster's work constitutes the cornerstone of modern terminology. Our research specifically explores the reception of Wüster's work by studying writings about it in the academic literature in English, Spanish and French between 1979 and 2009, in Europe and in the Americas. Conducted within the debate on the reception of Wüster's work, this study focuses exclusively on the analysis of critiques and commentaries of his work. To this end, we took into account Wüster's intellectual production, its positive or negative reception, new theoretical approaches in terminology, as well as studies on the state of the art in terminology between 1979 and 2009. Through exploratory qualitative research, we analyzed a corpus of texts in which the authors: a. quoted verbatim at least one excerpt from a text written by Wüster; b. referred to Wüster's work in the article's bibliography; or c. commented on that work. In this way, we identified the broad outlines of the debate surrounding the reception of his work. The results of our study are telling. They give a clear picture of the reception of Wüster's work in the scientific community. First, Wüster is a central figure of modern terminology with respect to terminological standardization; he was the first to propose a theory of terminology.
Second, the appropriate contextualization of his work is an essential starting point for an informed and fair appreciation of his contribution to the evolution of the discipline. Third, the results of our research reveal how new theoretical approaches to terminology have adapted to scientific and technical progress. Fourth, a study of 166 articles published in scholarly journals confirms that Wüster's work provoked varied reactions in both Europe and the Americas, and that its reception is rather positive. Our results also point to a tendency of authors to criticize Wüster's work, with which, in most cases, they nevertheless do not seem to be well acquainted. The "methodology of scientific research programmes" proposed by Lakatos (1978), applied as an interpretive model, allowed us to show that Wüster played a decisive role in the development of terminology as a discipline and that terminology can be viewed as a scientific research programme. The main conclusion of our thesis is that terminology has undergone considerable and progressive changes that have helped it become, in Lakatosian terms, a strong discipline both theoretically and descriptively.

Relevance: 30.00%

Abstract:

The vehicle routing problem (VRP), introduced by Dantzig and Ramser in 1959, has become one of the most studied problems in operations research, owing to its methodological interest and its practical impact in many fields such as transportation, logistics, telecommunications and production. The general objective of the VRP is to optimize the use of transportation resources in order to meet customers' needs while respecting the constraints arising from the requirements of the application context. Real-world applications of the VRP must account for a wide variety of constraints, and the more numerous these constraints are, the harder the problem is to solve. VRPs that take into account the full set of constraints encountered in practice, and thus approach real applications, form the class of 'rich' vehicle routing problems. Solving these problems efficiently poses considerable challenges for the VRP research community. This thesis, composed of two parts, explores certain extensions of the VRP toward these problems. The first part addresses the periodic VRP with time window constraints (PVRPTW). It extends the classical VRP with time windows (VRPTW) by considering a planning horizon of several days during which customers generally do not need to be served every day, but instead can be visited according to a set of possible combinations of delivery days. This generalization widens the range of applications of the problem to various commercial distribution activities, such as waste collection, street sweeping, food distribution, mail delivery, etc.
The main scientific contribution of the first part of this thesis is the development of a hybrid metaheuristic in which a set of local search procedures and neighbourhood-based metaheuristics cooperate with a genetic algorithm to improve solution quality and promote population diversity. The results obtained show that the proposed method performs very well and yields new best solutions for some large instances of the problem. The second part of this study presents, models and solves two rich vehicle routing problems that extend the VRPTW in that they include time-dependent pickup and delivery demands with temporal synchronization restrictions. These problems are known respectively as the Time-dependent Multi-zone Multi-Trip Vehicle Routing Problem with Time Windows (TMZT-VRPTW) and the Multi-zone Multi-Trip Pickup and Delivery Problem with Time Windows and Synchronization (MZT-PDTWS). Both arise from planning the operations of two-tiered urban logistics systems. Their difficulty lies in handling two interlocking sets of decisions: the routing component, which determines the sequence of customers visited by each vehicle, and the scheduling component, which arranges vehicle arrivals subject to temporal synchronization restrictions. Previously, these issues were addressed separately; combining both types of decisions in a single mathematical formulation and a single solution method should therefore give better results than considering them separately.
In this study, we propose heuristic solutions that handle both types of decisions simultaneously, completely and efficiently. Experimental results confirm the performance of the proposed method when compared to other methods in the literature. Indeed, the developed method produces solutions that require fewer vehicles and incur lower travel costs to carry out the same amount of work. In the context of urban logistics systems, our results imply fewer vehicles on city streets and, consequently, a reduced negative impact on congestion and the environment.
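As a point of reference for the routing component discussed above, a classical greedy construction heuristic for the capacitated VRP can be sketched in a few lines. This baseline is far simpler than the hybrid metaheuristic developed in the thesis and ignores time windows, periodicity and synchronization; the coordinates, demands and capacity below are invented.

```python
import math

def nearest_neighbour_vrp(depot, customers, demands, capacity):
    """Greedy CVRP construction: repeatedly extend the current route to
    the nearest unserved customer that still fits within capacity,
    starting a fresh route from the depot when none fits."""
    unserved = set(customers)
    routes = []
    while unserved:
        route, load, pos = [], 0.0, depot
        while True:
            feasible = [c for c in unserved if load + demands[c] <= capacity]
            if not feasible:
                break
            nxt = min(feasible, key=lambda c: math.dist(pos, c))
            route.append(nxt)
            load += demands[nxt]
            pos = nxt
            unserved.discard(nxt)
        routes.append(route)
    return routes

depot = (0.0, 0.0)
customers = [(1, 0), (2, 0), (0, 3), (0, 4), (5, 5)]
demands = {c: 1.0 for c in customers}
routes = nearest_neighbour_vrp(depot, customers, demands, capacity=2.0)
print(len(routes))   # → 3 routes, each serving at most 2 customers
```

Metaheuristics of the kind developed in the thesis typically start from such a constructed solution and then improve it through local search and recombination.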

Relevance: 30.00%

Abstract:

BACKGROUND: The literature on scoliosis screening is vast; however, because of the observational nature of the available data and methodological flaws, its interpretation is often complex, leading to incomplete and sometimes somewhat misleading conclusions. A set of methods for critical appraisal of the literature on scoliosis screening, together with a comprehensive summary and rating of the available evidence, therefore appeared essential. METHODS: To address these gaps, the aims of the study were: i) to propose a framework for the assessment of published studies on scoliosis screening effectiveness; ii) to suggest specific questions to be answered about screening effectiveness instead of trying to reach a global position for or against the programs; and iii) to contextualize the knowledge through expert panel consultation and meaningful recommendations. The general methodological approach proceeds through the following steps: elaboration of the conceptual framework; formulation of the review questions; identification of the criteria for the review; selection of the studies; critical assessment of the studies; synthesis of the results; and formulation and grading of recommendations in response to the questions. This plan follows, as closely as possible, the GRADE Group (Grades of Recommendation, Assessment, Development and Evaluation) requirements for systematic reviews, assessing the quality of evidence and grading the strength of recommendations. CONCLUSIONS: In this article, the methods developed in support of this work are presented, since they may be of interest for similar reviews in the scoliosis and orthopaedic fields.

Relevance: 30.00%

Abstract:

Applications of queueing theory in areas such as computer networking, ATM facilities, telecommunications and numerous other situations have led to extensive study of queueing models, making the subject an ever-expanding branch of applied probability. The thesis discusses the reliability of a k-out-of-n system in which the server also attends to external customers when there are no failed components (main customers), under a retrial policy, which is explained in detail. It then studies a multi-server, infinite-capacity queueing system where each customer arrives as an ordinary customer but can turn into a priority customer while waiting in the queue. The study also gives details of a finite-capacity multi-server queueing system with self-generation of priority customers, and of a single-server, infinite-capacity retrial queue where a customer in the orbit can turn into a priority customer; such a customer leaves the system if the server is already busy with a priority-generated customer, and otherwise is taken into service immediately. The arrival process follows a Markovian arrival process (MAP) and service times follow a Markovian service process (MSP).
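The reliability side of a k-out-of-n system can be illustrated with a much-reduced model: a single repair facility, exponential failures and repairs, and no external customers or retrials. Under these assumptions (which drop most of the structure studied in the thesis), the number of failed units is a birth-death chain whose steady state gives the availability:

```python
import numpy as np

def k_out_of_n_availability(n, k, lam, mu):
    """Steady-state availability of a k-out-of-n:G system with one
    repair facility, per-unit failure rate lam and repair rate mu.
    State j = number of failed units, j = 0 .. n-k+1; the system is
    down only in the last state, where failures are assumed to stop."""
    down = n - k + 1                      # system fails once more than n-k units fail
    states = down + 1
    Q = np.zeros((states, states))
    for j in range(states - 1):
        Q[j, j + 1] = (n - j) * lam       # another working unit fails
    for j in range(1, states):
        Q[j, j - 1] = mu                  # the single repairman fixes one unit
    np.fill_diagonal(Q, -Q.sum(axis=1))
    # Solve pi Q = 0 subject to sum(pi) = 1.
    A = np.vstack([Q.T, np.ones(states)])
    b = np.zeros(states + 1)
    b[-1] = 1.0
    pi = np.linalg.lstsq(A, b, rcond=None)[0]
    return 1.0 - pi[-1]

print(round(k_out_of_n_availability(n=3, k=2, lam=0.1, mu=1.0), 4))   # → 0.9559
```

The thesis's models layer the external-customer stream, retrial orbit and MAP/MSP processes on top of this basic failure-repair skeleton.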

Relevance: 30.00%

Abstract:

The preceding discussion and review of the literature show that studies on gear selectivity have received great attention, while gear efficiency studies do not seem to have received equal consideration. In temperate waters the fishing industry is well organised, relatively large and well-equipped vessels and gear are used for commercial fishing, and the number of species is smaller; in the tropics, particularly in India, small-scale fishery dominates the scene and the fishery is multispecies, operated upon by multigear. Therefore many of the problems faced in India may not exist in developed countries, which is perhaps the reason for the paucity of literature on the problems of estimating relative efficiency. Much work has been carried out in estimating relative efficiency (Pycha, 1962; Pope, 1963; Gulland, 1967; Dickson, 1971; Collins, 1979). The main subject of interest in the present thesis is an investigation into the problems in the comparison of fishing gears, especially in using classical test procedures with special reference to the prevailing fishing practices (that is, with reference to the catch data generated by the existing system). This has been taken up with a view to standardizing an approach for comparing the efficiency of fishing gear. Besides this, the implications of the terms 'gear efficiency' and 'gear selectivity' have been examined, and, based on the commonly used selectivity model (Holt, 1963), estimation of the ratio of the fishing powers of two gears has been considered. An attempt has also been made to determine the size of fish for which a gear is most efficient. The work is presented in eight chapters.
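One elementary device for comparing two gears, loosely in the spirit of the classical procedures examined in the thesis, is the mean log catch ratio over paired hauls, back-transformed to a fishing power ratio; catches are often treated as roughly log-normal, which motivates the log scale. The data below are entirely hypothetical and the estimator is a sketch, not the thesis's method.

```python
import math

def relative_fishing_power(catch_a, catch_b):
    """Estimate the fishing power of gear A relative to gear B as the
    back-transformed mean log catch ratio over paired hauls."""
    logs = [math.log((a + 0.5) / (b + 0.5))   # +0.5 guards against zero catches
            for a, b in zip(catch_a, catch_b)]
    return math.exp(sum(logs) / len(logs))

# Hypothetical paired hauls (numbers of fish) by two gears fished side by side.
gear_a = [30, 52, 18, 44, 25]
gear_b = [14, 30, 10, 20, 11]
print(relative_fishing_power(gear_a, gear_b))   # gear A roughly twice as powerful
```

The thesis's point is precisely that such classical comparisons become delicate with multispecies, multigear catch data, where pairing and distributional assumptions are hard to justify.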

Relevance: 30.00%

Abstract:

The work is intended to study the following important aspects of document image processing and to develop new methods: (1) segmentation of document images using an adaptive interval-valued neuro-fuzzy method; (2) improving the segmentation procedure using the simulated annealing technique; (3) development of optimized compression algorithms using a genetic algorithm and a parallel genetic algorithm; (4) feature extraction from document images; and (5) development of interval-valued (IV) fuzzy rules. This work also supports feature extraction and foreground/background identification. The proposed work incorporates evolutionary and hybrid methods for the segmentation and compression of document images. A study of the different neural networks used in image processing and of developments in the area of fuzzy logic is also carried out in this work.
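The role of simulated annealing in item (2) can be conveyed with a toy example: annealing a binarization threshold so as to minimise the intra-class variance of the two segments (the criterion of Otsu's method). The neuro-fuzzy segmenter of the thesis is not reproduced here; the pixel values, cooling schedule and parameters below are all invented.

```python
import math
import random

def intra_class_variance(pixels, t):
    """Objective for binary segmentation: weighted variance of the two
    classes produced by threshold t (the quantity Otsu's method minimises)."""
    lo = [p for p in pixels if p <= t]
    hi = [p for p in pixels if p > t]
    def var(xs):
        if len(xs) < 2:
            return 0.0
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    n = len(pixels)
    return len(lo) / n * var(lo) + len(hi) / n * var(hi)

def anneal_threshold(pixels, steps=3000, temp=50.0, cooling=0.998):
    """Simulated annealing over thresholds 0..255: worse moves are
    accepted with probability exp(-delta/T), letting the search escape
    local minima while the temperature decays."""
    random.seed(3)
    t = 128
    cost = intra_class_variance(pixels, t)
    for _ in range(steps):
        cand = min(255, max(0, t + random.randint(-10, 10)))
        c = intra_class_variance(pixels, cand)
        if c < cost or random.random() < math.exp(-(c - cost) / temp):
            t, cost = cand, c
        temp *= cooling
    return t

# Bimodal toy "image": dark pixels near 40, bright pixels near 200.
random.seed(7)
pixels = [random.gauss(40, 10) for _ in range(300)] + \
         [random.gauss(200, 10) for _ in range(300)]
t = anneal_threshold(pixels)
print(40 < t < 200)   # the threshold lands between the two modes
```

The same accept/reject schedule can drive far richer moves, e.g. perturbing the parameters of a fuzzy segmenter rather than a single scalar threshold.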

Relevance: 30.00%

Abstract:

Study of variable stars is an important topic of modern astrophysics. With the advent of powerful telescopes and high-resolution CCDs, variable star data are accumulating on the order of petabytes. This huge amount of data calls for automated methods as well as human experts. This thesis is devoted to data analysis of variable stars' astronomical time series data and hence belongs to the interdisciplinary field of Astrostatistics. For an observer on Earth, stars whose apparent brightness changes over time are called variable stars. The variation in brightness may be regular (periodic), quasi-periodic (semi-periodic) or irregular (aperiodic), and is caused by various mechanisms. In some cases the variation is due to internal thermo-nuclear processes; such stars are generally known as intrinsic variables. In other cases it is due to external processes, such as eclipses or rotation; these are known as extrinsic variables. Intrinsic variables can be further grouped into pulsating variables, eruptive variables and flare stars. Extrinsic variables are grouped into eclipsing binary stars and chromospheric stars. Pulsating variables can again be classified into Cepheid, RR Lyrae, RV Tauri, Delta Scuti, Mira, etc. The eruptive or cataclysmic variables are novae, supernovae, etc., which occur rarely and are not periodic phenomena; most other variations are periodic in nature. Variable stars can be observed in many ways, such as photometry, spectrophotometry and spectroscopy. A sequence of photometric observations of a variable star produces time series data containing time, magnitude and error. The plot of a variable star's apparent magnitude against time is known as a light curve. If the time series data are folded on a period, the plot of apparent magnitude against phase is known as a phased light curve. The unique shape of the phased light curve is characteristic of each type of variable star.
One way to identify the type of a variable star and to classify it is for an expert to inspect the phased light curve visually. For the last several years, automated algorithms have been used to classify groups of variable stars with the help of computers. Research on variable stars can be divided into stages such as observation, data reduction, data analysis, modeling and classification. Modeling of variable stars helps to determine their short-term and long-term behaviour, to construct theoretical models (e.g. the Wilson-Devinney model for eclipsing binaries) and to derive stellar properties such as mass, radius, luminosity, temperature, internal and external structure, chemical composition and evolution. Classification requires the determination of basic parameters such as period, amplitude and phase, along with other derived parameters. Of these, period is the most important, since wrong periods lead to sparse light curves and misleading information. Time series analysis applies mathematical and statistical tests to data in order to quantify the variation, understand the nature of time-varying phenomena, gain physical understanding of the system and predict its future behaviour. Astronomical time series usually suffer from unevenly spaced time instants, varying error conditions and the possibility of large gaps. This is due to daily varying daylight and weather conditions for ground-based observations, while observations from space may suffer from the impact of cosmic-ray particles. Many large-scale astronomical surveys such as MACHO, OGLE, EROS, ROTSE, PLANET, Hipparcos, MISAO, NSVS, ASAS, Pan-STARRS, Kepler, ESA, Gaia, LSST and CRTS provide variable star time series data, even though their primary intention is not variable star observation.
The Center for Astrostatistics, Pennsylvania State University, was established to help the astronomical community with statistical tools for harvesting and analysing archival data, and most of these surveys release their data to the public for further analysis. Many period search algorithms exist for astronomical time series analysis; they can be classified into parametric methods (which assume some underlying distribution for the data) and non-parametric methods (which do not assume any statistical model, such as a Gaussian). Many of the parametric methods are based on variations of the discrete Fourier transform, such as the Generalised Lomb-Scargle periodogram (GLSP) by Zechmeister (2009) and Significant Spectrum (SigSpec) by Reegen (2007). Non-parametric methods include Phase Dispersion Minimisation (PDM) by Stellingwerf (1978) and the cubic spline method by Akerlof (1994). Even though most of these methods can be automated, none of them fully recovers the true periods. Wrong period detection can arise for several reasons, such as power leakage to other frequencies, which is due to the finite total interval, finite sampling interval and finite amount of data. Another problem is aliasing, due to the influence of regular sampling. Spurious periods also appear because of long gaps, and power flow to harmonic frequencies is an inherent problem of Fourier methods. Hence obtaining the exact period of a variable star from its time series data is still a difficult problem for huge databases subjected to automation. As Matthew Templeton, AAVSO, states, "Variable star data analysis is not always straightforward; large-scale, automated analysis design is non-trivial". Derekas et al. (2007) and Deb et al. (2010) state, "The processing of huge amounts of data in these databases is quite challenging, even when looking at seemingly small issues such as period determination and classification".
It will benefit the variable star astronomical community if basic parameters such as period, amplitude and phase can be obtained more accurately when huge time series databases are subjected to automation. In the present thesis, the theories behind four popular period search methods are studied, their strengths and weaknesses are evaluated by applying them to two survey databases, and finally a modified form of the cubic spline method is introduced to confirm the exact period of a variable star. For the classification of newly discovered variable stars and their entry into the "General Catalogue of Variable Stars" or other databases such as the "Variable Star Index", the characteristics of the variability have to be quantified in terms of variable star parameters.
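Of the period search methods named above, Phase Dispersion Minimisation (Stellingwerf, 1978) is the simplest to sketch: fold the light curve on each trial period and pick the period that minimises the pooled within-bin variance of the phased magnitudes relative to the overall variance. The synthetic, unevenly sampled light curve below is invented for the demonstration.

```python
import numpy as np

def pdm_theta(t, mag, period, nbins=10):
    """PDM statistic: fold the light curve on a trial period and compare
    the pooled within-bin variance of the phased magnitudes to the
    overall variance; the true period minimises theta."""
    phase = (t / period) % 1.0
    overall = mag.var(ddof=1)
    num, den = 0.0, 0
    for b in range(nbins):
        sel = mag[(phase >= b / nbins) & (phase < (b + 1) / nbins)]
        if len(sel) > 1:
            num += (len(sel) - 1) * sel.var(ddof=1)
            den += len(sel) - 1
    return (num / den) / overall

# Unevenly sampled synthetic light curve with a 2.5-day period.
rng = np.random.default_rng(4)
t = np.sort(rng.uniform(0, 100, 500))
mag = 12.0 + 0.3 * np.sin(2 * np.pi * t / 2.5) + rng.normal(0, 0.02, 500)
trials = np.linspace(1.0, 5.0, 2001)
best = trials[np.argmin([pdm_theta(t, mag, p) for p in trials])]
print(round(best, 2))   # recovers a period near 2.5 days
```

Being non-parametric, PDM copes with the uneven sampling described above, but it shares the harmonic and alias ambiguities that make fully automated period determination difficult.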

Relevance: 30.00%

Abstract:

Ultrafast laser pulses have become an integral part of the toolbox of countless laboratories doing physics, chemistry, and biological research. The work presented here is motivated by one strand of the ever-growing, interdisciplinary research toward understanding the fundamental workings of light-matter interactions. Specifically, attosecond pulses can be useful tools for obtaining the desired insight. However, access to, and the utility of, such pulses depends on the generation of intense, few-cycle, carrier-envelope-phase stabilized laser pulses. The presented work can be thought of as a roadmap toward the latter. From the oscillator that provides the broadband seed to the amplification methods, the integral pieces necessary for the generation of attosecond pulses are discussed. A range of topics from fundamentals to design challenges is presented, paving the way toward the practical implementation of an intense, few-cycle, carrier-envelope-phase stabilized laser source.