107 results for likelihood to publication
Abstract:
BACKGROUND: Most migrant studies have compared health characteristics between migrants and nationals of the host country. We aimed at comparing health characteristics of migrants with nationals from their home country. METHODS: Portuguese national health survey (2005-6; 30,173 participants aged 18-75 years) and four national health surveys conducted in Switzerland (2002, 2004, 2007 and 2011, totalling 1,170 Portuguese migrants of the same age range). Self-reported data on length of stay, cardiovascular risk factors, healthcare use and health status were collected. RESULTS: Resident Portuguese were significantly older and more educated than migrants. Resident Portuguese had a higher mean BMI and prevalence of obesity than migrants. Resident Portuguese also reported more frequently being hypertensive and having their blood pressure screened within the last year. On the contrary, migrant Portuguese were more frequently smokers, had a medical visit in the previous year more frequently and self-rated their health higher than resident Portuguese. After adjustment for age, gender, marital status and education, migrants had a higher likelihood of smoking, of having a medical visit the previous year, and of self-rating their current health as good or very good than resident Portuguese. Compared to Portuguese residents, cholesterol screening in the previous year was more common only among migrants living in Switzerland for more than 17 years. CONCLUSION: Portuguese migrants in Switzerland do not differ substantially from resident Portuguese regarding most cardiovascular risk factors. Migrants consider themselves healthier than Portuguese residents and more often had a recent medical visit.
Abstract:
IMPORTANCE: The discontinuation of randomized clinical trials (RCTs) raises ethical concerns and often wastes scarce research resources. The epidemiology of discontinued RCTs, however, remains unclear. OBJECTIVES: To determine the prevalence, characteristics, and publication history of discontinued RCTs and to investigate factors associated with RCT discontinuation due to poor recruitment and with nonpublication. DESIGN AND SETTING: Retrospective cohort of RCTs based on archived protocols approved by 6 research ethics committees in Switzerland, Germany, and Canada between 2000 and 2003. We recorded trial characteristics and planned recruitment from included protocols. Last follow-up of RCTs was April 27, 2013. MAIN OUTCOMES AND MEASURES: Completion status, reported reasons for discontinuation, and publication status of RCTs as determined by correspondence with the research ethics committees, literature searches, and investigator surveys. RESULTS: After a median follow-up of 11.6 years (range, 8.8-12.6 years), 253 of 1017 included RCTs were discontinued (24.9% [95% CI, 22.3%-27.6%]). Only 96 of 253 discontinuations (37.9% [95% CI, 32.0%-44.3%]) were reported to ethics committees. The most frequent reason for discontinuation was poor recruitment (101/1017; 9.9% [95% CI, 8.2%-12.0%]). In multivariable analysis, industry sponsorship vs investigator sponsorship (8.4% vs 26.5%; odds ratio [OR], 0.25 [95% CI, 0.15-0.43]; P < .001) and a larger planned sample size in increments of 100 (-0.7%; OR, 0.96 [95% CI, 0.92-1.00]; P = .04) were associated with lower rates of discontinuation due to poor recruitment. Discontinued trials were more likely to remain unpublished than completed trials (55.1% vs 33.6%; OR, 3.19 [95% CI, 2.29-4.43]; P < .001). CONCLUSIONS AND RELEVANCE: In this sample of trials based on RCT protocols from 6 research ethics committees, discontinuation was common, with poor recruitment being the most frequently reported reason. 
Greater efforts are needed to ensure the reporting of trial discontinuation to research ethics committees and the publication of results of discontinued trials.
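The association between discontinuation and nonpublication is summarized above as an odds ratio with a confidence interval. For readers who want to reproduce that kind of figure, here is a minimal sketch of an odds ratio with a Wald 95% confidence interval from a 2x2 table; the counts below are purely illustrative and are not the study's data (the paper's estimates are adjusted, not crude):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for a 2x2 table:
    a = group 1 with outcome, b = group 1 without,
    c = group 2 with outcome, d = group 2 without."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Illustrative counts only: 10/100 unpublished in one group vs 5/100 in the other
or_, lo, hi = odds_ratio_ci(10, 90, 5, 95)
```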
Abstract:
Background: 7-20% of children report suspected allergic reactions to anti-infectious drugs, the betalactams being the most frequently involved. Studies based on skin and challenge tests have shown that 2-60% (mean: 10-15%) of children with suspected betalactam hypersensitivity were really allergic to betalactams, and that the likelihood of betalactam hypersensitivity increased with the severity and/or the earliness of the reactions. Methods: We reviewed the records of 1,865 children explored for suspected betalactam hypersensitivity between December 1990 and July 2009. The objective was to confirm or rule out the diagnosis of betalactam hypersensitivity, to evaluate the diagnostic value of immediate- and non-immediate-reading skin tests with betalactams, and to determine risk factors for betalactam hypersensitivity. In those children, skin tests were performed first, and challenges with the suspected betalactams were then performed in most children with negative skin test results. Results: 1,431 children had a complete allergological work-up. 15.9% of those children were diagnosed allergic to betalactams by means of skin tests (7.2%), challenge tests (7.7%), and clinical history (0.9%), respectively. Immediate-type betalactam hypersensitivity was diagnosed in 3.5% and non-immediate type in 12.4% of the children. Skin tests diagnosed 86% and 31.6% of immediate and non-immediate sensitizations, respectively. Cross-reactivity among betalactams was diagnosed in 76% of the children with immediate hypersensitivity and in 14.7% of the children with non-immediate hypersensitivity. The number of children diagnosed allergic to betalactams decreased with the time between the reaction and the allergological work-up.
Finally, age, sex and personal atopy were not significant risk factors for betalactam allergy. Conclusion: This study, in a very large number of children, confirms that only a few children with suspected betalactam hypersensitivity are really allergic to betalactams, and that the likelihood of betalactam allergy and cross-reactivity with other betalactams increases with the severity and/or the earliness of the reactions. We also confirm that immediate-reading skin tests have a good diagnostic value, and that the diagnostic value of non-immediate-reading skin tests is low, the diagnosis of non-immediate hypersensitivity to betalactams in children being mainly based on challenge tests.
Abstract:
Discrete data arise in various research fields, typically when the observations are count data. I propose a robust and efficient parametric procedure for the estimation of discrete distributions. The estimation is done in two phases. First, a very robust, but possibly inefficient, estimate of the model parameters is computed and used to identify outliers. Then the outliers are either removed from the sample or given low weights, and a weighted maximum likelihood estimate (WML) is computed. The weights are determined via an adaptive process such that if the data follow the model, then asymptotically no observation is downweighted. I prove that the final estimator inherits the breakdown point of the initial one, and that its influence function at the model is the same as the influence function of the maximum likelihood estimator, which strongly suggests that it is asymptotically fully efficient. The initial estimator is a minimum disparity estimator (MDE). MDEs can be shown to have full asymptotic efficiency, and some MDEs have very high breakdown points and very low bias under contamination. Several initial estimators are considered, and the performances of the WMLs based on each of them are studied. It turns out that in a great variety of situations the WML substantially improves on the initial estimator, both in terms of finite-sample mean square error and in terms of bias under contamination. Besides, the performance of the WML is rather stable under a change of the MDE, even if the MDEs have very different behaviors. Two examples of application of the WML to real data are considered.
In both of them, the necessity for a robust estimator is clear: the maximum likelihood estimator is badly corrupted by the presence of a few outliers. This procedure is particularly natural in the discrete distribution setting, but could be extended to the continuous case, for which a possible procedure is sketched.
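The two-phase scheme described above can be sketched in a deliberately simplified form for a Poisson model. In this sketch the robust initial estimate is the sample median rather than a minimum disparity estimator, and the weights are hard 0/1 cut-offs rather than the adaptive weighting of the thesis; both substitutions are simplifications made only to keep the example short.

```python
import math

def poisson_pmf(k, lam):
    """Poisson probability mass function P(X = k) with mean lam."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def weighted_ml_poisson(data, cutoff=1e-4):
    """Toy two-phase robust estimate of a Poisson mean."""
    # Phase 1: robust (but not efficient) initial estimate: the sample median.
    s = sorted(data)
    n = len(s)
    lam0 = max((s[n // 2] + s[(n - 1) // 2]) / 2, 1e-9)
    # Phase 2: give zero weight to observations implausible under the initial
    # fit, then compute the weighted MLE (for a Poisson mean, a weighted mean).
    weights = [1.0 if poisson_pmf(k, lam0) > cutoff else 0.0 for k in data]
    return sum(w * k for w, k in zip(weights, data)) / sum(weights)
```

On a sample with one gross outlier, the plain mean is pulled far off while the weighted estimate stays close to the bulk of the data.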
Abstract:
BACKGROUND: A number of medical journals have developed policies for accelerated publication of articles judged by the authors, the editors or the peer reviewers to be of special importance. However, the validity of these judgements is unknown. We therefore compared the importance of articles published on a "fast track" with those published in the usual way. METHODS: We identified 12 "case" articles--6 articles from the New England Journal of Medicine that were prereleased on the journal's Web site before publication in print and 6 "fast-tracked" articles from The Lancet. We then identified 12 "control" articles matched to the case articles according to journal, disease or procedure of focus, theme area and year of publication. Forty-two general internists rated the articles, using 10-point scales, on dimensions addressing the articles' importance, ease of applicability and impact on health outcomes. RESULTS: For each dimension, the mean score for the case articles was significantly higher than the mean score for the control articles: importance to clinical practice 7.6 v. 7.1 respectively (p = 0.001), importance from a public health perspective 6.5 v. 6.0 (p < 0.001), contribution to advancement of medical knowledge 6.2 v. 5.8 (p < 0.001), ease of applicability in practice 7.0 v. 6.5 (p < 0.001), potential impact on health outcomes 6.5 v. 5.9 (p < 0.001). Despite these general findings, in 5 of the 12 matched pairs of articles the control article had a higher mean score than the case article across all the dimensions. INTERPRETATION: The accelerated publication practices of 2 leading medical journals targeted articles that, on average, had slightly higher importance scores than similar articles published in the usual way. However, our finding of higher importance scores for control articles in 5 of the 12 matched pairs shows that current journal practices for selecting articles for expedited publication are inconsistent.
Abstract:
Adult animals can eavesdrop on behavioral interactions between potential opponents to assess their competitive ability and motivation to contest resources without interacting directly with them. Surprisingly, eavesdropping is not yet considered an important factor in resolving conflicts between family members. In this study, we show that nestling barn owls (Tyto alba) competing for food eavesdrop on nestmates' vocal interactions to assess the dominance status and food needs of opponents. During a first training playback session, we broadcasted to singleton bystander nestlings a simulated vocal interaction between 2 prerecorded individuals, 1 relatively old (i.e., senior) and 1 younger nestling (i.e., junior). One playback individual, the "responder," called systematically just after the "initiator" playback individual, hence displaying a higher hunger level. To test whether nestlings had eavesdropped on this interaction, we broadcasted the same prerecorded individuals separately in a subsequent playback test session. Nestlings vocalized more rapidly after former initiators' than responders' calls, and they produced more calls when the broadcasted individual was formerly a junior initiator. They chiefly vocally challenged juniors and initiators, against whom the likelihood of winning a vocal contest is higher. Owlets therefore identified the age hierarchy between 2 competitors based on their vocalizations. They also memorized the dynamics of competitors' previous vocal interactions, and used this information to optimally adjust their signaling level once interacting with only 1 of the competitors. We conclude that siblings eavesdrop on one another to resolve conflicts over parental resources.
Abstract:
BACKGROUND: The yeast Schizosaccharomyces pombe is frequently used as a model for studying the cell cycle. The cells are rod-shaped and divide by medial fission. The process of cell division, or cytokinesis, is controlled by a network of signaling proteins called the Septation Initiation Network (SIN); SIN proteins associate with the spindle pole bodies (SPBs) during nuclear division (mitosis). Some SIN proteins associate with both SPBs early in mitosis, and then display strongly asymmetric signal intensity at the SPBs in late mitosis, just before cytokinesis. This asymmetry is thought to be important for correct regulation of SIN signaling, and for coordination of cytokinesis and mitosis. In order to study the dynamics of organelles or large protein complexes such as the SPB, which have been labeled with a fluorescent protein tag in living cells, a number of image-analysis problems must be solved: the cell outline must be detected automatically, and the position and signal intensity associated with the structures of interest within the cell must be determined. RESULTS: We present a new 2D and 3D image analysis system that permits versatile and robust analysis of motile, fluorescently labeled structures in rod-shaped cells. We have implemented it as a user-friendly software package allowing fast and robust image analysis of large numbers of rod-shaped cells. We have developed new robust algorithms, which we combined with existing methodologies to facilitate fast and accurate analysis. Our software permits the detection and segmentation of rod-shaped cells in either static or dynamic (i.e. time-lapse) multi-channel images. It enables tracking of two structures (for example SPBs) in two different image channels.
For 2D or 3D static images, the locations of the structures are identified, and intensity values are then extracted together with several quantitative parameters, such as length, width, cell orientation, background fluorescence and the distance between the structures of interest. Furthermore, two kinds of kymographs of the tracked structures can be established: one representing their migration with respect to their relative position, the other representing their individual trajectories inside the cell. This software package, called "RodCellJ", allowed us to analyze a large number of S. pombe cells to understand the rules that govern SIN protein asymmetry. CONCLUSIONS: "RodCellJ" is freely available to the community as a package of several ImageJ plugins for the extensive, simultaneous analysis of the behavior of large numbers of rod-shaped cells. The integration of different image-processing techniques in a single package, together with the development of novel algorithms, not only speeds up analysis relative to existing tools but also yields higher accuracy. Its utility was demonstrated on both 2D and 3D static and dynamic images to study the septation initiation network of the yeast Schizosaccharomyces pombe. More generally, it can be used in any biological context where fluorescent-protein-labeled structures need to be analyzed in rod-shaped cells. AVAILABILITY: RodCellJ is freely available at http://bigwww.epfl.ch/algorithms.html (after acceptance of the publication).
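As a toy illustration of the kind of measurement such a package automates (this is not RodCellJ's actual algorithm), the sketch below locates the two brightest pixels in a small intensity grid, a crude stand-in for two fluorescently tagged SPBs, and reports the distance between them:

```python
def brightest_two_spots(img):
    """img: 2D list of intensities. Returns the (row, col) coordinates of
    the two brightest pixels and the Euclidean distance between them."""
    coords = [(r, c) for r in range(len(img)) for c in range(len(img[0]))]
    coords.sort(key=lambda rc: img[rc[0]][rc[1]])   # ascending by intensity
    (r1, c1), (r2, c2) = coords[-2:]                # two brightest pixels
    dist = ((r1 - r2) ** 2 + (c1 - c2) ** 2) ** 0.5
    return [(r1, c1), (r2, c2)], dist
```

A real pipeline would first segment the cell outline, suppress background fluorescence and fit sub-pixel centroids; the sketch only conveys the core measurement.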
Abstract:
Introduction: According to guidelines, patients with coronary artery disease (CAD) should undergo revascularization if myocardial ischemia is present. While coronary angiography (CXA) allows the morphological assessment of CAD, the fractional flow reserve (FFR) has proved to be a complementary invasive test to assess the functional significance of CAD, i.e. to detect ischemia. Perfusion cardiac magnetic resonance (CMR) has turned out to be a robust non-invasive technique to assess myocardial ischemia. The objective is to compare the cost-effectiveness ratio - defined as the cost per patient correctly diagnosed - of two algorithms used to diagnose hemodynamically significant CAD, in relation to the pretest likelihood of CAD: 1) a CMR to assess ischemia before referring positive patients to CXA (CMR + CXA); 2) a CXA in all patients, combined with an FFR test in patients with angiographically positive stenoses (CXA + FFR). Methods: The costs, evaluated from the health-care system perspective in the Swiss, German, United Kingdom (UK) and United States (US) contexts, included the public prices of the different tests (considered as outpatient procedures), the costs of complications, and the costs induced by diagnostic errors (false negatives). The effectiveness criterion was the ability to accurately identify a patient with significant CAD. Test performances used in the model were based on the clinical literature. Using a mathematical model, we compared the cost-effectiveness ratio of both algorithms for hypothetical patient cohorts with different pretest likelihoods of CAD. Results: The cost-effectiveness ratio decreased hyperbolically with increasing pretest likelihood of CAD for both strategies. CMR + CXA and CXA + FFR were equally cost-effective at a pretest likelihood of CAD of 62% in Switzerland, 67% in Germany, 83% in the UK and 84% in the US, with costs of CHF 5'794, Euros 1'472, £2'685 and $2'126 per patient correctly diagnosed.
Below these thresholds, CMR + CXA showed lower costs per patient correctly diagnosed than CXA + FFR. Implications for the health care system/professionals/patients/society: These results facilitate decision making for the clinical use of new generations of imaging procedures to detect ischemia. They show to what extent the cost-effectiveness of diagnosing CAD depends on the prevalence of the disease.
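The shape of the result (a hyperbolic fall in cost per correct diagnosis as pretest likelihood rises) can be reproduced with a toy version of such a model. The costs and test accuracies below are invented for illustration and are not the study's inputs:

```python
def cost_per_correct(pretest, cost_first, cost_second, sens, spec):
    """Two-step strategy: everyone gets the first test, only positives get
    the confirmatory second test. Effectiveness = diseased patients
    correctly identified, mirroring the study's criterion."""
    pos_rate = pretest * sens + (1 - pretest) * (1 - spec)  # referred onward
    expected_cost = cost_first + pos_rate * cost_second
    true_positives = pretest * sens
    return expected_cost / true_positives

# Hypothetical inputs: the ratio falls as pretest likelihood of CAD rises.
ratios = [cost_per_correct(p, 1000, 2000, 0.90, 0.85) for p in (0.1, 0.4, 0.8)]
```

Because the denominator scales with pretest probability while cost grows only modestly, the ratio blows up at low prevalence, which is the hyperbolic behavior the study reports.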
Abstract:
BACKGROUND AND STUDY AIMS: The current gold standard in Barrett's esophagus monitoring consists of four-quadrant biopsies every 1-2 cm in accordance with the Seattle protocol. Adding brush cytology processed by digital image cytometry (DICM) may further increase the detection of patients with Barrett's esophagus who are at risk of neoplasia. The aim of the present study was to assess the additional diagnostic value and accuracy of DICM when added to the standard histological analysis in a cross-sectional multicenter study of patients with Barrett's esophagus in Switzerland. METHODS: One hundred sixty-four patients with Barrett's esophagus underwent 239 endoscopies with biopsy and brush cytology. DICM was carried out on 239 cytology specimens. Measures of the test accuracy of DICM (relative risk, sensitivity, specificity, likelihood ratios) were obtained by dichotomizing the histopathology results (high-grade dysplasia or adenocarcinoma vs. all others) and the DICM results (aneuploidy/intermediate pattern vs. diploidy). RESULTS: DICM revealed diploidy in 83% of 239 endoscopies, an intermediate pattern in 8.8%, and aneuploidy in 8.4%. An intermediate DICM result carried a relative risk (RR) of 12, and aneuploidy an RR of 27, for high-grade dysplasia/adenocarcinoma. Adding DICM to the standard biopsy protocol, a pathological cytometry result (aneuploid or intermediate) was found in 25 of 239 endoscopies (11%; 18 patients) with low-risk histology (no high-grade dysplasia or adenocarcinoma). During follow-up of 14 of these 18 patients, histological deterioration was seen in 3 (21%). CONCLUSION: DICM from brush cytology may add important information to a standard biopsy protocol by identifying a subgroup of Barrett's esophagus patients with high-risk cellular abnormalities.
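The accuracy measures reported here (sensitivity, specificity, likelihood ratios) all derive from the same dichotomized 2x2 table. A minimal sketch, with invented counts rather than the study's data:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Test accuracy from a dichotomized 2x2 table
    (rows: test positive/negative, columns: disease present/absent)."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    lr_pos = sens / (1 - spec)   # positive likelihood ratio
    lr_neg = (1 - sens) / spec   # negative likelihood ratio
    return sens, spec, lr_pos, lr_neg

# Illustrative counts only: 80 TP, 10 FP, 20 FN, 90 TN
sens, spec, lr_pos, lr_neg = diagnostic_accuracy(80, 10, 20, 90)
```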
Abstract:
Swain corrects the chi-square overidentification test (i.e., the likelihood ratio test of fit) for structural equation models, whether with or without latent variables. The chi-square statistic is asymptotically correct; however, it does not behave as expected in small samples and/or when the model is complex (cf. Herzog, Boomsma, & Reinecke, 2007). Thus, particularly in situations where the ratio of sample size (n) to the number of parameters estimated (p) is relatively small (i.e., the p to n ratio is large), the chi-square test will tend to overreject correctly specified models. To obtain a closer approximation to the distribution of the chi-square statistic, Swain (1975) developed a correction: this scaling factor, which converges to 1 asymptotically, is multiplied with the chi-square statistic. The correction better approximates the chi-square distribution, resulting in more appropriate Type I error rates (see Herzog & Boomsma, 2009; Herzog et al., 2007).
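Mechanically, the correction is a multiplicative rescaling of the test statistic before the p-value is computed. The sketch below shows the effect of an assumed factor c < 1; Swain's actual factor, a function of n and the number of parameters, is not reproduced here. The chi-square survival function is implemented via its closed form, which is exact for even degrees of freedom:

```python
import math

def chi2_sf(x, df):
    """P(X > x) for a chi-square variable; closed form for even df:
    exp(-x/2) * sum_{k=0}^{df/2 - 1} (x/2)^k / k!"""
    assert df % 2 == 0, "this closed form requires even df"
    h = x / 2
    return math.exp(-h) * sum(h ** k / math.factorial(k) for k in range(df // 2))

def corrected_p(chi2_stat, df, c):
    """Rescale the statistic by a small-sample factor c (c -> 1 as n grows)."""
    return chi2_sf(c * chi2_stat, df)
```

With c < 1 the statistic shrinks and the p-value rises, tempering the small-sample overrejection; in practice the factor must of course be computed from n and the model size rather than assumed.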
Abstract:
This thesis focuses on the visual system in healthy subjects and schizophrenic patients. To address this research, advanced methods of analysis of electroencephalographic (EEG) data were used and developed. This manuscript comprises three scientific articles. The first article presents a novel method to control the physical features of visual stimuli (luminance and spatial frequencies). The second article shows, using electrical neuroimaging of EEG, a deficit in spatial processing associated with the dorsal pathway in chronic schizophrenic patients. This deficit was elicited by an absent modulation of the P1 component in terms of response strength and topography as well as source estimations, and was orthogonal to the preserved ability to process Kanizsa-type illusory contours. Finally, the third article resolved ongoing debates concerning the neural mechanism mediating illusory contour sensitivity by using electrical neuroimaging to show that the first differentiation of illusory contour presence vs. absence is localized within the lateral occipital complex (LOC). This effect was subsequent to modulations due to the orientation of misaligned grating stimuli. Collectively, these results support a model where effects in V1/V2 are mediated by "top-down" modulation from the LOC. To understand these three articles, the Introduction of this thesis presents the major concepts used in them; additionally, a section is devoted to time-frequency analysis methods not presented in the articles themselves. The Introduction is divided into four parts. The first part presents three aspects of the visual system: cellular, regional, and its functional interactions. The second part presents an overview of schizophrenia and its sensory-cognitive deficits. The third part presents an overview of illusory contour processing and the three models examined in the third article. Finally, advanced analysis methods for EEG are presented, including time-frequency methodology. The Introduction is followed by a synopsis of the main results in the articles as well as those obtained from the time-frequency analyses. Finally, the Discussion chapter is divided along three axes. The first axis discusses the time-frequency analysis and proposes a novel statistical approach that is independent of the reference. The second axis contextualizes the first article and discusses the quality of the stimulus control and directions for further improvements. Finally, both neurophysiological articles are contextualized by proposing future experiments and hypotheses that may serve to improve our understanding of schizophrenia on the one hand and of visual functions more generally on the other.
Abstract:
Calceology is the study of recovered archaeological leather footwear and comprises the conservation, documentation and identification of leather shoe components and shoe styles. Recovered leather shoes are complex artefacts that present technical, stylistic and personal information about the culture and people that used them. The current method in calceological research for typology and chronology is comparison with parallel examples, though its use is hampered by the absence of basic definitions and the lack of a taxonomic hierarchy. The research findings on the primary cutting patterns, used for making all leather footwear, are integrated with the named style method and the Goubitz notation, resulting in a combined methodology as a basis for a typological organisation of recovered footwear and a chronology for named shoe styles. The history of calceological research is examined in chapter two and is accompanied by a review of methodological problems as seen in the literature. Through the examination of the various documentation and research techniques used during the history of calceological studies, the reasons why a standard typology and methodology failed to develop are investigated. The variety and continual invention of a new research method for each publication of a recovered leather assemblage hindered the development of a single standard methodology. Chapter three covers the initial research with the database, through which the primary cutting patterns were identified and the named styles were defined. The chronological span of each named style was established through iterative cross-site seriation and named style comparisons. The consistent use of the primary cutting patterns is interpreted technically as the result of constraints imposed by the leather and the forms needed to cover the foot. Basic parts of the shoe patterns and the foot are defined, and terms are provided for identifying the key points for pattern making.
Chapter four presents the seventeen primary cutting patterns and their sub-types; these are divided into three main groups: six integral soled patterns, four hybrid soled patterns and seven separately soled patterns. Descriptions of the letter codes, pattern layout, construction principle, closing seam placement and a list of sub-types are included for each primary cutting pattern. The named shoe styles and their relative chronology are presented in chapter five. Nomenclature for the named styles is based on the find location of the first published example plus the primary cutting pattern code letter. The named styles are presented in chronological order from prehistory through to the late 16th century. Short descriptions of the named styles are given and illustrated with examples of recovered archaeological leather footwear, reconstructions of archaeological shoes and iconographical sources. Chapter six presents the documentation of recovered archaeological leather using the Goubitz notation, an inventory and description of style elements and fastening methods used for defining named shoe styles, technical information about sole/upper constructions, and the consequences of the use of lasts and sewing forms for style identification and fastening placement in relation to the instep point. The chapter concludes with further technical information about the implications for researchers of shoemaking, pattern making and reconstructive archaeology. The conclusion restates the original research question of why a group of primary cutting patterns appears to have been used consistently throughout the European archaeological record. The quantitative and qualitative results from the database show the use of these patterns, but it is the properties of the leather that impose the use of the primary cutting patterns.
The combined methodology of primary pattern identification, named style and artefact registration provides a framework for calceological research.
Abstract:
BACKGROUND: Strict definition of invasive aspergillosis (IA) cases is required to allow precise conclusions about the efficacy of antifungal therapy. The Global Comparative Aspergillus Study (GCAS) compared voriconazole to amphotericin B (AmB) deoxycholate for the primary therapy of IA. Because predefined definitions used for this trial were substantially different from the consensus definitions proposed by the European Organization for Research and Treatment of Cancer/Mycoses Study Group in 2008, we recategorized the 379 episodes of the GCAS according to the later definitions. METHODS: The objectives were to assess the impact of the current definitions on the classification of the episodes and to provide comparative efficacy for probable/proven and possible IA in patients treated with either voriconazole or AmB. In addition to original data, we integrated the results of baseline galactomannan serum levels obtained from 249 (65.7%) frozen samples. The original response assessment was accepted unchanged. RESULTS: Recategorization allowed 59 proven, 178 probable, and 106 possible IA cases to be identified. A higher favorable 12-week response rate was obtained with voriconazole (54.7%) than with AmB (29.9%) (P < .0001). Survival was higher for voriconazole for mycologically documented (probable/proven) IA (70.2%) than with AmB (54.9%) (P = .010). Higher response rates were obtained in possible IA treated with voriconazole vs AmB with the same magnitude of difference (26.2%; 95% confidence interval [CI], 7.2%-45.3%) as in mycologically documented episodes (24.3%; 95% CI, 11.9%-36.7%), suggesting that possible cases are true IA. CONCLUSIONS: Recategorization resulted in a better identification of the episodes and confirmed the higher efficacy of voriconazole over AmB deoxycholate in mycologically documented IA.
Abstract:
The package HIERFSTAT for the statistical software R, created by the R Development Core Team, allows the estimation of hierarchical F-statistics from a hierarchy with any number of levels. In addition, it allows testing the statistical significance of population differentiation at these different levels, using a generalized likelihood-ratio test. The package HIERFSTAT is available at http://www.unil.ch/popgen/softwares/hierfstat.htm.
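A generalized likelihood-ratio test of differentiation compares observed allele counts with those expected under no differentiation. Below is a language-neutral sketch of the G statistic for a flat populations-by-alleles count table; HIERFSTAT itself is an R package and handles hierarchical sampling designs, which this toy example does not attempt:

```python
import math

def g_statistic(table):
    """Likelihood-ratio (G) statistic for independence in a contingency
    table of allele counts (rows: populations, columns: alleles)."""
    row = [sum(r) for r in table]
    col = [sum(c) for c in zip(*table)]
    n = sum(row)
    g = 0.0
    for i, r in enumerate(table):
        for j, obs in enumerate(r):
            if obs:
                exp = row[i] * col[j] / n  # expected count under no differentiation
                g += 2 * obs * math.log(obs / exp)
    df = (len(table) - 1) * (len(table[0]) - 1)
    return g, df
```

Under the null hypothesis G is approximately chi-square distributed with the given degrees of freedom; identical allele frequencies in all populations give G = 0.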