910 results for 090108 Satellite Space Vehicle and Missile Design and Testing
Abstract:
BACKGROUND: This study assesses the validity and reliability of the Spanish version of the DN4 questionnaire as a tool for the differential diagnosis of pain syndromes with a neuropathic (NP) or somatic component (non-neuropathic pain, NNP). METHODS: A study was conducted in two phases: cultural adaptation into Spanish by means of conceptual equivalence, including duplicate forward and backward translations and cognitive debriefing, and testing of psychometric properties in patients with NP (peripheral, central and mixed) and NNP. The analysis of psychometric properties included reliability (internal consistency, inter-rater agreement and test-retest reliability) and validity (ROC curve analysis, agreement with the reference diagnosis, and determination of sensitivity, specificity, and positive and negative predictive values in different subsamples according to the type of NP). RESULTS: A sample of 164 subjects (99 women, 60.4%; age: 60.4 +/- 16.0 years) was enrolled: 94 (57.3%) with NP (36 with peripheral, 32 with central and 26 with mixed pain) and 70 with NNP. The questionnaire was reliable [Cronbach's alpha coefficient: 0.71; inter-rater agreement coefficient: 0.80 (0.71-0.89); test-retest intra-class correlation coefficient: 0.95 (0.92-0.97)] and valid for a cut-off value >= 4 points, which was the best value to discriminate between NP and NNP subjects. DISCUSSION: This study, the first validation of the DN4 questionnaire in a language other than the original, not only supported its high discriminatory value for the identification of neuropathic pain but also provided supplemental psychometric validation (i.e. test-retest reliability, influence of educational level and pain intensity) and showed its validity in mixed pain syndromes.
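The validity analysis above rests on the standard 2x2 confusion-matrix quantities. A minimal sketch, with invented toy scores rather than the study's data, of how sensitivity, specificity, PPV and NPV fall out of a ">= 4 points" cut-off rule:

```python
# Hypothetical illustration of a questionnaire cut-off rule
# (score >= 4 -> classify as neuropathic pain). Data are invented.

def cutoff_metrics(scores, is_np, cutoff=4):
    """Sensitivity, specificity, PPV and NPV for a >= cutoff rule."""
    tp = sum(1 for s, y in zip(scores, is_np) if s >= cutoff and y)
    fn = sum(1 for s, y in zip(scores, is_np) if s < cutoff and y)
    fp = sum(1 for s, y in zip(scores, is_np) if s >= cutoff and not y)
    tn = sum(1 for s, y in zip(scores, is_np) if s < cutoff and not y)
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Toy sample: (questionnaire score, has neuropathic pain?)
scores = [6, 5, 2, 7, 1, 4, 3, 5]
labels = [True, True, False, True, True, False, False, False]
print(cutoff_metrics(scores, labels))
# sensitivity 0.75, specificity 0.5, ppv 0.6, npv ~0.667
```

Sweeping `cutoff` over the score range and plotting sensitivity against 1 - specificity is exactly the ROC analysis the abstract refers to.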
Abstract:
In standard multivariate statistical analysis, common hypotheses of interest concern changes in mean vectors and subvectors. In compositional data analysis it is now well established that compositional change is most readily described in terms of the simplicial operation of perturbation, and that subcompositions replace the marginal concept of subvectors. To motivate the statistical developments of this paper we present two challenging compositional problems from food production processes. Against this background the relevance of perturbations and subcompositions can be clearly seen. Moreover, we can identify a number of hypotheses of interest involving the specification of particular perturbations or differences between perturbations, and also hypotheses of subcompositional stability. We identify the two problems as being the counterparts, in the jargon of standard multivariate analysis, of paired-comparison or split-plot experiments and of separate-sample comparative experiments. We then develop appropriate estimation and testing procedures for a complete lattice of relevant compositional hypotheses.
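The perturbation operation the abstract leans on has a simple closed form: multiply two compositions component-wise and re-close the result to unit sum. A minimal sketch (not code from the paper; the numbers are invented):

```python
# Simplicial perturbation from compositional data analysis:
# x (+) p = C(x1*p1, ..., xD*pD), where C is the closure operation.

def closure(x):
    """Rescale a positive vector so its parts sum to 1."""
    s = sum(x)
    return [xi / s for xi in x]

def perturb(x, p):
    """Perturbation x (+) p: component-wise product, then closure."""
    return closure([xi * pi for xi, pi in zip(x, p)])

# A 3-part composition and a perturbing composition
x = closure([1.0, 2.0, 7.0])
p = closure([2.0, 1.0, 1.0])
print(perturb(x, p))  # a new composition summing to 1
```

Hypotheses about compositional change (e.g. "the process shifts the composition by a specified perturbation") are then statements about `p` rather than about vector differences.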
Abstract:
Chagas disease is caused by Trypanosoma cruzi, which is mainly transmitted by the faeces of triatomine insects that find favourable environments in poorly constructed houses. Previous studies have documented persistent triatomine infestation in houses in the province of Loja in southern Ecuador despite repeated insecticide and educational interventions. We aim to develop a sustainable strategy for the interruption of Chagas disease transmission by promoting living environments that are designed to prevent colonisation of rural houses by triatomines. This study used positive deviance to inform the design of an anti-triatomine prototype house by identifying knowledge, attitudes and practices used by families that have remained triatomine-free (2010-2012). Positive deviants reported practices that included maintenance of structural elements of the house, fumigation of dwellings and animal shelters, sweeping with "insect repellent" plants and relocation of domestic animals away from the house, among others. Participants favoured construction materials that do not drastically differ from those currently used (adobe walls and tile roofs). They also expressed their belief in a clear connection between a clean house and health. The family's economic dynamics affect space use and must be considered in the prototype's design. Overall, the results indicate a positive climate for the introduction of housing improvements as a protective measure against Chagas disease in this region.
Abstract:
Multiple genome-wide association studies (GWAS) have been performed in HIV-1 infected individuals, identifying common genetic influences on viral control and disease course. Similarly, common genetic correlates of acquisition of HIV-1 after exposure have been interrogated using GWAS, although in generally small samples. Under the auspices of the International Collaboration for the Genomics of HIV, we have combined the genome-wide single nucleotide polymorphism (SNP) data collected by 25 cohorts, studies, or institutions on HIV-1 infected individuals and compared them to carefully matched population-level data sets (a list of all collaborators appears in Note S1 in Text S1). After imputation using the 1000 Genomes Project reference panel, we tested approximately 8 million common DNA variants (SNPs and indels) for association with HIV-1 acquisition in 6,334 infected patients and 7,247 population samples of European ancestry. Initial association testing identified the SNP rs4418214, the C allele of which is known to tag the HLA-B*57:01 and B*27:05 alleles, as genome-wide significant (p = 3.6×10^-11). However, restricting the analysis to individuals with a known date of seroconversion suggested that this association was due to frailty bias in studies of lethal diseases. Further analyses, including testing recessive genetic models, testing for bulk effects of non-genome-wide-significant variants, stratifying by sexual or parenteral transmission risk, and testing previously reported associations, showed no evidence for a genetic influence on HIV-1 acquisition (with the exception of CCR5Δ32 homozygosity). Thus, these data suggest that genetic influences on HIV acquisition are either rare or have smaller effects than can be detected by this sample size.
Abstract:
Two different approaches currently prevail for predicting spatial patterns of species assemblages. The first approach (macroecological modelling, MEM) focuses directly on realised properties of species assemblages, whereas the second approach (stacked species distribution modelling, S-SDM) starts with constituent species to approximate assemblage properties. Here, we propose to unify the two approaches in a single 'spatially-explicit species assemblage modelling' (SESAM) framework. This framework uses relevant species source pool designations, macroecological factors, and ecological assembly rules to constrain predictions of the richness and composition of species assemblages obtained by stacking predictions of individual species distributions. We believe that such a framework could prove useful in many theoretical and applied disciplines of ecology and evolution, both for improving our basic understanding of species assembly across spatio-temporal scales and for anticipating expected consequences of local, regional or global environmental changes. In this paper, we propose such a framework and call for further developments and testing across a broad range of community types in a variety of environments.
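The SESAM idea of constraining stacked predictions can be caricatured in a few lines: sum the per-species occurrence probabilities for the naive S-SDM richness expectation, then keep only as many species as a macroecological richness model allows. This is a hypothetical sketch; the species names, probabilities and probability-ranking rule are assumptions for illustration, not the authors' implementation:

```python
# Toy illustration of constraining a stacked species distribution model
# (S-SDM) with a macroecological (MEM) richness prediction at one site.

def sesam_site(probs, richness_cap):
    """Keep the `richness_cap` most probable species at a site."""
    ranked = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)
    return {sp for sp, _ in ranked[:richness_cap]}

# Hypothetical per-species SDM probabilities for a single site
probs = {"sp_a": 0.9, "sp_b": 0.7, "sp_c": 0.4, "sp_d": 0.2}
stacked_richness = sum(probs.values())  # naive S-SDM expectation: 2.2
mem_richness = 2                        # assumed MEM richness prediction
print(sesam_site(probs, mem_richness))  # {'sp_a', 'sp_b'}
```

In the full framework, ecological assembly rules (not just probability rank) would decide which species survive the cut.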
Abstract:
This paper investigates the relationship between trade openness and the size of government, both theoretically and empirically. We show that openness can increase the size of governments through two channels: (1) a terms of trade externality, whereby trade lowers the domestic cost of taxation and (2) the demand for insurance, whereby trade raises risk and public transfers. We provide a unified framework for studying and testing these two mechanisms. First, we show how their relative strength depends on a key parameter, the elasticity of substitution between domestic and foreign goods. Second, while the terms of trade externality leads to inefficiently large governments, the increase in public spending due to the demand for insurance is optimal. We show that large volumes of trade may result in welfare losses if the terms of trade externality is strong enough while small volumes of trade are always beneficial. Third, we provide new evidence on the positive association between openness and the size of government and test whether it is consistent with the terms of trade externality or the demand for insurance. Our findings suggest that the positive relationship is remarkably robust and that the terms of trade externality may be the driving force behind it, thus raising warnings that globalization may have led to inefficiently large governments.
Abstract:
Equivalence classes of normal form games are defined using the geometry of correspondences of standard equilibrium concepts like correlated, Nash, and robust equilibrium, or risk dominance and rationalizability. The resulting equivalence classes are fully characterized and compared across different equilibrium concepts for 2 x 2 games. It is argued that the procedure can lead to broad and game-theoretically meaningful distinctions between games, as well as to alternative ways of viewing and testing equilibrium concepts. Larger games are also briefly considered.
Abstract:
During the development and testing of a radioreceptor assay (RRA) for human IL-1, we detected and identified the presence of auto-antibodies to IL-1 in normal human plasma (NHP). The RRA is based on the competition between human 125I-labeled rIL-1 alpha and standard or unknown quantities of IL-1 alpha or IL-1 beta for binding to a limited amount of IL-1 receptor (IL-1R) isolated from the EL4 mouse thymoma cell line. NHP from 20 out of 100 unselected blood donors were found to completely inhibit the binding of 125I-labeled IL-1 alpha to its receptor, suggesting the presence in these NHP samples of either abnormal amounts of IL-1 or of a factor binding to the 125I-labeled IL-1 alpha. Special care was taken to ascertain that the inhibitory factors were antibodies and not soluble IL-1 receptor antagonist. When plasma samples with inhibiting activity were incubated with labeled IL-1 alpha and chromatographed on a Sephadex G200 column, they were found to contain 125I-labeled complexes with an apparent molecular weight of 150-200 kD. The IL-1 binding factor could be eliminated from plasma by incubation with protein A-Sepharose, suggesting that it consisted of IgG antibodies directed against IL-1. Furthermore, the antibody nature of the inhibiting factor was confirmed by its binding to purified rIL-1 coupled to Sepharose. Screening of 200 NHP samples by incubation with 100 pg of 125I-labeled IL-1, followed by precipitation with 12% polyethylene glycol (PEG), confirmed that about 25% of NHP contain detectable IgG antibodies to IL-1 alpha, while only 2% of NHP contain antibodies to IL-1 beta. No correlation between the presence of these anti-IL-1 antibodies and any particular major histocompatibility complex or any pathological condition was detected. We suggest that all serum samples assayed for IL-1 alpha or IL-1 beta content should be pretested with the PEG precipitation assay described here.
Abstract:
The main objective of this research is to link granular physics with the modelling of rock avalanches. The laboratory experiments consisted of finding a convenient granular material, i.e. grain size and physical behaviour, and testing it on a simple slope geometry. Once the appropriate sliding material was selected, we attempted to model the debris avalanche and its spreading on a slope over different substrata, in order to understand the relationship between the volume and the reach angle, i.e. the angle of the line joining the top of the scar and the end of the deposit. For a better understanding of the mass spreading, the deposits were scanned with a laser scanner. The datasets were compared to see how grain size and volume influence a debris avalanche. The relationship between the roughness and the grain size of the substratum shows that the spreading of the sliding mass increases when the roughness of the substratum becomes equivalent to or greater than the grain size of the flowing mass. The runout distance displays a more complex relationship, because a long runout distance implies that the grains are less spread: if the substratum is too rough the distance diminishes, and likewise if it is too smooth, because the effect on the apparent friction decreases. So far our findings do not permit validation of any previous model (Melosh, 1987; Bagnold, 1956).
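The reach angle defined in the abstract reduces to simple trigonometry: the angle whose tangent is the fall height H from the top of the scar divided by the horizontal runout length L to the end of the deposit. A small illustrative helper (the laboratory-scale numbers are invented):

```python
# Reach angle of a granular avalanche: angle of the line joining the
# top of the scar to the distal end of the deposit, tan(a) = H / L.

import math

def reach_angle_deg(drop_height_m, runout_m):
    """Reach angle in degrees from fall height and horizontal runout."""
    return math.degrees(math.atan2(drop_height_m, runout_m))

# Hypothetical laboratory-scale geometry
print(reach_angle_deg(0.5, 1.0))  # ~26.6 degrees
```

A lower reach angle for a given volume indicates greater mobility of the sliding mass, which is why the abstract tracks it against volume and substratum roughness.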
Abstract:
BACKGROUND: Serologic testing algorithms for recent HIV seroconversion (STARHS) provide important information for HIV surveillance. We have shown that a patient's antibody reaction in a confirmatory line immunoassay (INNO-LIA HIV I/II Score, Innogenetics) provides information on the duration of infection. Here, we sought to further investigate the diagnostic specificity of various Inno-Lia algorithms and to identify factors affecting it. METHODS: Plasma samples of 714 selected patients of the Swiss HIV Cohort Study, infected for longer than 12 months and representing all viral clades and stages of chronic HIV-1 infection, were tested blindly by Inno-Lia and classified as either incident (up to 12 months) or older infection by 24 different algorithms. Of the total, 524 patients received HAART, 308 had HIV-1 RNA below 50 copies/mL, and 620 were infected by an HIV-1 non-B clade. Using logistic regression analysis we evaluated factors that might affect the specificity of these algorithms. RESULTS: HIV-1 RNA <50 copies/mL was associated with significantly lower reactivity to all five HIV-1 antigens of the Inno-Lia and impaired the specificity of most algorithms. Among 412 patients either untreated or with HIV-1 RNA ≥50 copies/mL despite HAART, the median specificity of the algorithms was 96.5% (range 92.0-100%). The only factor that significantly promoted false-incident results in this group was age, with false-incident results increasing by a few percent per additional year. HIV-1 clade, HIV-1 RNA, CD4 percentage, sex, disease stage, and testing modalities had no significant effect. Results were similar among the 190 untreated patients. CONCLUSIONS: The specificity of most Inno-Lia algorithms was high and not affected by HIV-1 variability, advanced disease, or other factors promoting false-recent results in other STARHS. Specificity should be good in any group of untreated HIV-1 patients.
Abstract:
The deficit that exists in our country regarding the availability of quantitative indicators with which to carry out a short-term analysis of regional industrial activity has opened a debate centred on which methodology is most suitable for constructing indicators of this kind. Within this framework, this paper presents the main conclusions obtained in earlier studies (Clar et al., 1997a, 1997b and 1998) on the suitability of extending the methodologies currently applied to the Spanish regions to construct indicators of industrial activity by indirect methods. These conclusions lead us to propose a strategy different from those currently in use. Specifically, following Israilevich and Kuttner (1993), a latent-variable model is proposed to estimate an indicator of regional industrial production. This type of model can be specified as a state-space model and estimated by means of the Kalman filter. To validate the proposed methodology, indicators are estimated accordingly for three of the four Spanish regions that have an Industrial Production Index (IPI) constructed by the direct method (Andalusia, Asturias and the Basque Country) and compared with the published (official) IPIs. The results show the good performance of the proposed strategy, thus opening a line of work with which to remedy the deficit mentioned above.
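The state-space/Kalman-filter machinery referred to above can be illustrated with the simplest case, a scalar local-level model in which the latent index follows a random walk observed with noise. This is an assumption-laden toy, not the latent-variable model of Israilevich and Kuttner; all numbers are invented:

```python
# Minimal scalar Kalman filter for a local-level state-space model:
#   state:       x_t = x_{t-1} + w_t,  w_t ~ N(0, q)
#   observation: y_t = x_t + v_t,      v_t ~ N(0, r)

def kalman_local_level(observations, q=0.1, r=1.0):
    """Filtered estimates of a random-walk latent state from noisy data."""
    x, p = observations[0], 1.0       # initial state and variance
    filtered = []
    for y in observations:
        p_pred = p + q                # predict: variance grows by q
        k = p_pred / (p_pred + r)     # Kalman gain
        x = x + k * (y - x)           # update with observation y
        p = (1.0 - k) * p_pred
        filtered.append(x)
    return filtered

# Hypothetical noisy monthly activity series
series = [100.0, 101.5, 99.8, 102.3, 103.1]
print(kalman_local_level(series))
```

In the paper's setting, the observation equation would link several indirect indicators to the latent regional production index, but the predict/update recursion is the same.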
Abstract:
The research reported in this series of articles aimed at (1) automating the search for questioned ink specimens in ink reference collections and (2) evaluating the strength of ink evidence in a transparent and balanced manner. These aims require that ink samples be analysed in an accurate and reproducible way and compared in an objective and automated way; the latter requirement is due to the large number of comparisons that are necessary in both scenarios. A research programme was designed to (a) develop a standard methodology for analysing ink samples in a reproducible way, (b) compare ink samples automatically and objectively, and (c) evaluate the proposed methodology in forensic contexts. This report focuses on the last of the three stages of the research programme. The calibration and acquisition process and the mathematical comparison algorithms were described in previous papers [C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part I: Development of a quality assurance process for forensic ink analysis by HPTLC, Forensic Sci. Int. 185 (2009) 29-37; C. Neumann, P. Margot, New perspectives in the use of ink evidence in forensic science - Part II: Development and testing of mathematical algorithms for the automatic comparison of ink samples analysed by HPTLC, Forensic Sci. Int. 185 (2009) 38-50]. In this paper, the benefits and challenges of the proposed concepts are tested in two forensic contexts: (1) ink identification and (2) ink evidential value assessment. The results show that different algorithms are better suited to different tasks. This research shows that it is possible to build digital ink libraries using the most commonly used ink analytical technique, i.e. high-performance thin-layer chromatography, despite its reputation for lacking reproducibility. More importantly, it is possible to assign evidential value to ink evidence in a transparent way using a probabilistic model. It is therefore possible to move away from the traditional subjective approach, which is entirely based on experts' opinion and which is usually not very informative. While there is room for improvement, this report demonstrates the significant gains obtained over the traditional subjective approach for the search of ink specimens in ink databases and the interpretation of their evidential value.
Abstract:
The film depicts period traffic congestion, sharp and winding sections of road, steep hills making trucks slow to a crawl, and dangerous vehicle and pedestrian crossings, all important reasons why highway design and safety improvements, and highway relocation were needed. In fact, when the film was produced, U.S. 30 or the Lincoln Highway was the busiest primary road in Iowa; and the section between State Center and Boone was deemed “critical,” meaning it was considered dangerous by the ISHC’s Efficiency Standards.