943 results for Methods of Compression
Abstract:
There are various methods to collect adverse events (AEs) in clinical trials. How AEs are collected in vaccine trials is of special interest: solicited reporting can lead to over-reporting of events that have little or no biological relationship to the vaccine. We assessed the rate of AEs listed in the package insert for the virosomal hepatitis A vaccine Epaxal(®), comparing data collected by solicited or unsolicited self-reporting. In an open, multi-centre post-marketing study, 2675 healthy travellers received single doses of vaccine administered intramuscularly. AEs were recorded based on solicited and unsolicited questioning during a four-day period after vaccination. A total of 2541 questionnaires could be evaluated (95.0% return rate). Solicited self-reporting resulted in significantly higher (p<0.0001) rates of subjects with AEs than unsolicited reporting, both at baseline (18.9% solicited versus 2.1% unsolicited systemic AEs) and following immunization (29.6% versus 19.3% local AEs; 33.8% versus 18.2% systemic AEs). This could indicate that actual reporting rates of AEs with Epaxal(®) may be substantially lower than described in the package insert. The distribution of AEs differed significantly between the applied methods of collecting AEs. The most common AEs listed in the package insert were reported almost exclusively with solicited questioning. The reporting of local AEs was more likely than that of systemic AEs to be influenced by subjects' sex, age and study centre. Women reported higher rates of AEs than men. The results highlight the need to detail how vaccine tolerability was reported and assessed.
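As a rough illustration of how a difference in reporting rates like 33.8% versus 18.2% can be tested, the sketch below applies a standard two-proportion z-test. The per-group size of 1270 is a hypothetical even split of the 2541 evaluable questionnaires, not a figure from the study, and the study's actual analysis may have used a different (e.g., paired) test.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p

# Illustrative: 33.8% vs 18.2% systemic AEs, hypothetical n = 1270 per method
z, p = two_proportion_z(round(0.338 * 1270), 1270, round(0.182 * 1270), 1270)
print(f"z = {z:.1f}, p = {p:.2g}")  # p far below 0.0001
```

With these (invented) group sizes the difference is overwhelmingly significant, consistent with the reported p<0.0001.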
Abstract:
In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. A key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process vs. those that measure flux through the autophagy pathway (i.e., the complete process); thus, a block in macroautophagy that results in autophagosome accumulation needs to be differentiated from stimuli that result in increased autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. 
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.
Abstract:
A study was conducted on methods of basis set superposition error (BSSE)-free geometry optimization and frequency calculations in clusters larger than a dimer. In particular, three different counterpoise schemes were critically examined. It was shown that the counterpoise-corrected supermolecule energy can be easily obtained in all cases by using the many-body partitioning of the energy.
Abstract:
In vivo localized proton magnetic resonance spectroscopy (1H MRS) has become a powerful and unique technique to non-invasively investigate the brain metabolism of rodents and humans. The main goal of 1H MRS is the reliable quantification of metabolite concentrations (the neurochemical profile) in a well-defined region of the brain. The availability of very high magnetic field strengths, combined with the possibility of acquiring spectra at very short echo times, has dramatically increased the number of constituents of the neurochemical profile. The quantification of spectra measured at short echo times is complicated by the presence of macromolecule signals, which are of particular importance at high magnetic fields. An error in the macromolecule estimation can lead to substantial errors in the obtained neurochemical profile. The purpose of the present review is to give an overview of high-field 1H MRS methods with a focus on metabolite quantification, in particular on the handling of macromolecule signals. Three main approaches to handling macromolecule signals are described: mathematical estimation of the macromolecules, measurement of the macromolecules in vivo, and direct acquisition of the in vivo spectrum without the macromolecule contribution.
Abstract:
OBJECTIVE: To describe the goals and methods of contemporary public health surveillance and to present the activities of the Observatoire Valaisan de la Santé (OVS), a tool unique in Switzerland to conduct health surveillance for the population of a canton. METHODS: Narrative review and presentation of the OVS. RESULTS: Public health surveillance consists of systematic and continuous collection, analysis, interpretation and dissemination of health data necessary for public health planning. Surveillance is organized according to contemporary public health issues. Switzerland is currently in an era dominated by chronic diseases due to ageing of the population. This "new public health" era is also characterized by the growing importance of health technology, rational risk management, preventive medicine and health promotion, and the central role of the citizen/patient. Information technologies provide access to new health data, but public health surveillance methods need to be adapted. In Switzerland, health surveillance activities are conducted by several public and private bodies, at federal and cantonal levels. The Valais canton has set up the OVS, an integrative, regional, and reactive system to conduct surveillance. CONCLUSION: Public health surveillance provides information useful for public health decisions and actions. It constitutes a key element for public health planning.
Abstract:
PURPOSE: Negative lifestyle factors are known to be associated with increased cardiovascular risk (CVR) in children, but research on their combined impact on a general population of children is sparse. Therefore, we aimed to quantify the combined impact of easily assessable negative lifestyle factors on the CVR scores of randomly selected children after 4 years. METHODS: Of the 540 randomly selected 6- to 13-year-old children, 502 children participated in a baseline health assessment, and 64% were assessed again after 4 years. Measures included anthropometry, fasting blood samples, and a health assessment questionnaire. Participants scored one point for each negative lifestyle factor at baseline: overweight; physical inactivity; high media consumption; little outdoor time; skipping breakfast; and having a parent who has ever smoked, is inactive, or overweight. A CVR score at follow-up was constructed by averaging sex- and age-related z-scores of waist circumference, blood pressure, glucose, inverted high-density lipoprotein, and triglycerides. RESULTS: The age-, sex-, pubertal stage-, and social class-adjusted probabilities (95% confidence interval) for being in the highest CVR score tertile at follow-up for children who had at most one (n = 48), two (n = 64), three (n = 56), four (n = 41), or five or more (n = 14) risky lifestyle factors were 15.4% (8.9-25.3), 24.3% (17.4-32.8), 36.0% (28.6-44.2), 49.8% (38.6-61.0), and 63.5% (47.2-77.2), respectively. CONCLUSIONS: Even in childhood, an accumulation of negative lifestyle factors is associated with higher CVR scores after 4 years. These negative lifestyle factors are easy to assess in clinical practice and allow early detection and prevention of CVR in childhood.
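The construction of the composite CVR score (averaging standardized components, with HDL inverted so that higher always means higher risk) can be sketched as follows. This simplified version standardizes within the sample rather than against sex- and age-specific reference values as in the study, and the measurements for the three children are invented.

```python
import math

def z_scores(values):
    """Standardize a list of raw values to mean 0, SD 1 (population SD)."""
    m = sum(values) / len(values)
    sd = math.sqrt(sum((v - m) ** 2 for v in values) / len(values))
    return [(v - m) / sd for v in values]

def cvr_scores(waist, sbp, glucose, hdl, trig):
    """Average component z-scores per child; HDL is inverted so that
    higher always means higher cardiovascular risk."""
    cols = [z_scores(waist), z_scores(sbp), z_scores(glucose),
            [-z for z in z_scores(hdl)],  # invert HDL
            z_scores(trig)]
    return [sum(col[i] for col in cols) / len(cols) for i in range(len(waist))]

# Hypothetical measurements for three children
scores = cvr_scores(waist=[60, 68, 75], sbp=[100, 108, 118],
                    glucose=[4.8, 5.1, 5.4], hdl=[1.6, 1.3, 1.1],
                    trig=[0.7, 0.9, 1.3])
print([round(s, 2) for s in scores])  # risk increases across the three children
```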
Abstract:
Conventional methods of gene prediction rely on the recognition of DNA-sequence signals, the coding potential or the comparison of a genomic sequence with a cDNA, EST, or protein database. Reasons for limited accuracy in many circumstances are species-specific training and the incompleteness of reference databases. Lately, comparative genome analysis has attracted increasing attention. Several analysis tools that are based on human/mouse comparisons are already available. Here, we present a program for the prediction of protein-coding genes, termed SGP-1 (Syntenic Gene Prediction), which is based on the similarity of homologous genomic sequences. In contrast to most existing tools, the accuracy of SGP-1 depends little on species-specific properties such as codon usage or the nucleotide distribution. SGP-1 may therefore be applied to nonstandard model organisms in vertebrates as well as in plants, without the need for extensive parameter training. In addition to predicting genes in large-scale genomic sequences, the program may be useful to validate gene structure annotations from databases. To this end, SGP-1 output also contains comparisons between predicted and annotated gene structures in HTML format. The program can be accessed via a Web server at http://soft.ice.mpg.de/sgp-1. The source code, written in ANSI C, is available on request from the authors.
Abstract:
The objective of this study is to examine the literature and identify the most salient outcomes of early postnatal discharge for women, newborns and the health system. An electronic search strategy was designed covering the following sources: Web of Science, Scopus, ProQuest and PubMed/MEDLINE, using the following terms: (early AND discharge) OR (length AND stay) AND (postpartum OR postnatal) AND (effect* OR result OR outcome). Content analysis was used to identify and summarise the findings and methods of the research papers. The available evidence is not sufficient to either support or reject the practice of early postnatal discharge; different studies have reported different outcomes for women and newborns. The need for systematic clinical research is discussed.
Abstract:
The nutritional and physiological qualities of breast milk make it the best food for newborns, favouring their wellbeing and growth. The implementation of a programme encouraging the breastfeeding of hospitalised newborns in care departments requires specific methods of organisation, as well as constant and adapted support from health professionals.
Abstract:
OBJECTIVE: The presence of minority nonnucleoside reverse transcriptase inhibitor (NNRTI)-resistant HIV-1 variants prior to antiretroviral therapy (ART) has been linked to virologic failure in treatment-naive patients. DESIGN: We performed a large retrospective study to determine the number of treatment failures that could have been prevented by implementing minority drug-resistant HIV-1 variant analyses in ART-naive patients in whom no NNRTI resistance mutations were detected by routine resistance testing. METHODS: Of 1608 patients in the Swiss HIV Cohort Study who had initiated first-line ART with two nucleoside reverse transcriptase inhibitors (NRTIs) and one NNRTI before July 2008, 519 patients were eligible on the basis of HIV-1 subtype, viral load and sample availability. Key NNRTI drug resistance mutations K103N and Y181C were measured by allele-specific PCR in 208 of 519 randomly chosen patients. RESULTS: Minority K103N and Y181C drug resistance mutations were detected in five out of 190 (2.6%) and 10 out of 201 (5%) patients, respectively. Focusing on 183 patients for whom virologic success or failure could be examined, virologic failure occurred in seven out of 183 (3.8%) patients; minority K103N and/or Y181C variants were present prior to ART initiation in only two of those patients. The NNRTI-containing, first-line ART was effective in 10 patients with preexisting minority NNRTI-resistant HIV-1 variants. CONCLUSION: As shown in case-control settings, minority NNRTI-resistant HIV-1 variants can have an impact on ART. However, the implementation of minority NNRTI-resistant HIV-1 variant analysis in addition to genotypic resistance testing (GRT) cannot be recommended in routine clinical settings. Additional associated risk factors need to be discovered.
Abstract:
Decline in gait stability has been associated with increased fall risk in older adults. Reliable and clinically feasible methods of gait instability assessment are needed. This study evaluated the relative and absolute reliability and concurrent validity of the testing procedure of the clinical version of the Narrow Path Walking Test (NPWT) under single task (ST) and dual task (DT) conditions. Thirty independent community-dwelling older adults (65-87 years) were tested twice. Participants were instructed to walk within the 6-m narrow path without stepping out. Trial time, number of steps, trial velocity, number of step errors, and number of cognitive task errors were determined. Intraclass correlation coefficients (ICCs) were calculated as indices of agreement, and a graphic approach called "mountain plot" was applied to help interpret the direction and magnitude of disagreements between testing procedures. The smallest detectable change and smallest real difference (SRD) were computed to determine clinically relevant improvement at group and individual levels, respectively. Concurrent validity was assessed using the Performance Oriented Mobility Assessment Tool (POMA) and the Short Physical Performance Battery (SPPB). Test-retest agreement (ICC1,2) varied from 0.77 to 0.92 in ST and from 0.78 to 0.92 in DT conditions, with no apparent systematic differences between testing procedures demonstrated by the mountain plot graphs. The smallest detectable change and smallest real difference were small for motor task performance and larger for cognitive errors. Significant correlations were observed for trial velocity and trial time with POMA and SPPB. The present results indicate that the NPWT testing procedure is highly reliable and reproducible.
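A smallest detectable change is commonly derived from test-retest reliability via the standard error of measurement. The sketch below assumes the usual formulas SEM = SD * sqrt(1 - ICC) and SDC95 = 1.96 * sqrt(2) * SEM; the SD of 2.0 s is invented, while the ICC of 0.92 is one of the values reported above.

```python
import math

def sem(sd, icc):
    """Standard error of measurement from between-subject SD and ICC."""
    return sd * math.sqrt(1 - icc)

def sdc(sd, icc, z=1.96):
    """Smallest detectable change at the 95% level: z * sqrt(2) * SEM."""
    return z * math.sqrt(2) * sem(sd, icc)

# Illustrative: trial time with SD = 2.0 s and test-retest ICC = 0.92
print(f"SEM = {sem(2.0, 0.92):.2f} s, SDC95 = {sdc(2.0, 0.92):.2f} s")
```

An observed individual change smaller than SDC95 cannot be distinguished from measurement noise, which is why higher ICCs (smaller SEMs) make a test more useful for tracking change.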
Abstract:
In this paper we argue that inventory models are probably not useful models of household money demand because the majority of households does not hold any interest-bearing assets. The relevant decision for most people is not the fraction of assets to be held in interest-bearing form, but whether to hold any of such assets at all. The implications of this realization are interesting and important. We find that (a) the elasticity of money demand is very small when the interest rate is small, (b) the probability that a household holds any amount of interest-bearing assets is positively related to the level of financial assets, and (c) the cost of adopting financial technologies is positively related to age and negatively related to the level of education. Unlike the traditional methods of money demand estimation, our methodology allows for the estimation of the interest elasticity at low values of the nominal interest rate. The finding that the elasticity is very small for interest rates below 5 percent suggests that the welfare costs of inflation are small. At interest rates of 6 percent, the elasticity is close to 0.5. We find that roughly one half of this elasticity can be attributed to the Baumol-Tobin or intensive margin and half of it can be attributed to the new adopters or extensive margin. The intensive margin is less important at lower interest rates and more important at higher interest rates.
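The intensive (Baumol-Tobin) margin mentioned above follows the classic square-root rule, whose interest elasticity is exactly one half in absolute value. A minimal numerical check, with hypothetical parameter values:

```python
import math

def money_demand(income, cost, i):
    """Baumol-Tobin square-root rule: optimal average money holdings,
    given spending (income), a per-trip transaction cost, and the
    nominal interest rate i."""
    return math.sqrt(cost * income / (2 * i))

# Interest elasticity estimated numerically via log differences
income, cost = 50_000.0, 10.0          # hypothetical values
i1, i2 = 0.05, 0.06
m1, m2 = money_demand(income, cost, i1), money_demand(income, cost, i2)
elasticity = (math.log(m2) - math.log(m1)) / (math.log(i2) - math.log(i1))
print(f"interest elasticity = {elasticity:.3f}")  # -0.5
```

Because money holdings scale with i**(-1/2), the log-difference estimate is exactly -0.5 regardless of the income and cost values chosen.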
Abstract:
In this paper we propose a Pyramidal Classification Algorithm, which together with an appropriate aggregation index produces an indexed pseudo-hierarchy (in the strict sense) without inversions or crossings. The computer implementation of the algorithm makes it possible to carry out simulation tests by Monte Carlo methods in order to study the efficiency and sensitivity of the pyramidal methods of the Maximum, Minimum and UPGMA. The results shown in this paper may help to choose between the three classification methods proposed, in order to obtain the classification that best fits the original structure of the population, provided we have a priori information concerning this structure.
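The three aggregation indices named above (Minimum, Maximum, UPGMA) differ only in how the distance from a newly merged cluster to each remaining cluster is updated. A minimal sketch of those update rules (just the distance-update step, not the pyramidal algorithm itself):

```python
def update_distance(method, d_ik, d_jk, n_i, n_j):
    """Distance from the merged cluster (i union j) to another cluster k,
    for the three aggregation indices compared in the paper."""
    if method == "minimum":       # single linkage
        return min(d_ik, d_jk)
    if method == "maximum":       # complete linkage
        return max(d_ik, d_jk)
    if method == "upgma":         # group average, weighted by cluster sizes
        return (n_i * d_ik + n_j * d_jk) / (n_i + n_j)
    raise ValueError(f"unknown method: {method}")

# Toy example: cluster i (3 objects) merges with cluster j (1 object);
# their distances to a third cluster k were 2.0 and 6.0
for m in ("minimum", "maximum", "upgma"):
    print(m, update_distance(m, 2.0, 6.0, 3, 1))
# minimum 2.0, maximum 6.0, upgma 3.0
```

Monte Carlo comparisons like those in the paper repeatedly draw samples from a population with known structure, run each method, and score how well the resulting classification recovers that structure.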
Abstract:
We consider the joint visualization of two matrices which have common rows and columns, for example multivariate data observed at two time points or split according to a dichotomous variable. Methods of interest include principal components analysis for interval-scaled data, or correspondence analysis for frequency data or ratio-scaled variables on commensurate scales. A simple result in matrix algebra shows that by setting up the matrices in a particular block format, matrix sum and difference components can be visualized. The case when we have more than two matrices is also discussed and the methodology is applied to data from the International Social Survey Program.
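The underlying matrix identity is elementary: two commensurate matrices decompose exactly into a sum component S = (A + B)/2 and a difference component D = (A - B)/2, with A = S + D and B = S - D, so visualizing S and D loses no information. A minimal sketch with invented data:

```python
def halves(A, B, sign):
    """Elementwise (A + sign*B) / 2 for two equally sized matrices."""
    return [[(a + sign * b) / 2 for a, b in zip(ra, rb)]
            for ra, rb in zip(A, B)]

# Hypothetical data matrix observed at two time points
A = [[4.0, 2.0], [1.0, 3.0]]
B = [[2.0, 6.0], [5.0, 1.0]]

S = halves(A, B, +1)   # sum component: the average structure
D = halves(A, B, -1)   # difference component: change between time points

# The decomposition is exact: A = S + D and B = S - D
assert all(a == s + d for ra, rs, rd in zip(A, S, D)
           for a, s, d in zip(ra, rs, rd))
print(S, D)
```

In the paper's setting, S and D (or the block matrix built from them) are then analyzed by PCA or correspondence analysis rather than inspected directly.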
Abstract:
Game theory is a branch of applied mathematics used to analyze situations where two or more agents are interacting. Originally it was developed as a model for conflicts and collaborations between rational and intelligent individuals. Now it finds applications in social sciences, economics, biology (particularly evolutionary biology and ecology), engineering, political science, international relations, computer science, and philosophy. Networks are an abstract representation of interactions, dependencies or relationships. Networks are extensively used in all the fields mentioned above and in many more. Much useful information about a system can be discovered by analyzing the current state of a network representation of that system. In this work we apply some of the methods of game theory to populations of interconnected agents. A population is represented by a network of players, where one player can interact with another only if there is a connection between them. In the first part of this work we show that the structure of the underlying network has a strong influence on the strategies that the players adopt to maximize their utility. We then introduce a supplementary degree of freedom by allowing the structure of the population to be modified during the simulations. This modification allows the players to change the structure of their environment in order to optimize the utility they can obtain.
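A minimal sketch of the kind of networked game described here: a prisoner's dilemma on a ring network, where each player interacts only with its neighbours and then imitates its best-scoring neighbour. The payoff values and the update rule are illustrative choices, not the thesis's exact model.

```python
# Prisoner's dilemma payoffs: 1 = cooperate, 0 = defect (illustrative values)
R, S, T, P = 3, 0, 5, 1  # reward, sucker, temptation, punishment

def payoff(me, other):
    return {(1, 1): R, (1, 0): S, (0, 1): T, (0, 0): P}[(me, other)]

def step(strategies):
    """One synchronous 'imitate the best' update on a ring network."""
    n = len(strategies)
    neighbours = [((i - 1) % n, (i + 1) % n) for i in range(n)]
    score = [sum(payoff(strategies[i], strategies[j]) for j in neighbours[i])
             for i in range(n)]
    # each player copies the strategy of its highest-scoring neighbour,
    # keeping its own strategy in case of a tie with itself listed first
    return [strategies[max([i] + list(neighbours[i]), key=lambda j: score[j])]
            for i in range(n)]

strategies = [1, 1, 1, 0, 1, 1, 1, 1]   # one defector on a ring of cooperators
for _ in range(3):
    strategies = step(strategies)
print(strategies)   # defection has spread to several neighbours
```

Even this toy run shows the abstract's point: on a sparse ring, the defector's high local payoff makes neighbours imitate it, so the network structure, not just the payoff matrix, shapes which strategies survive.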