183 results for 1600-1681

at Université de Lausanne, Switzerland

Relevance: 20.00%

Publisher:

Abstract:

Calceology is the study of recovered archaeological leather footwear and comprises the conservation, documentation and identification of leather shoe components and shoe styles. Recovered leather shoes are complex artefacts that present technical, stylistic and personal information about the culture and people that used them. The current method for typology and chronology in calceological research is comparison with parallel examples, but its use is hampered by the absence of basic definitions and of a taxonomic hierarchy. The research findings on the primary cutting patterns, used for making all leather footwear, are integrated with the named style method and the Goubitz notation, resulting in a combined methodology that serves as a basis for a typological organisation of recovered footwear and a chronology of named shoe styles. The history of calceological research is examined in chapter two, accompanied by a review of methodological problems as seen in the literature. Through an examination of the various documentation and research techniques used over the history of calceological studies, the reasons why a standard typology and methodology failed to develop are investigated: the continual invention of a new research method for each publication of a recovered leather assemblage hindered the development of a single standard methodology. Chapter three covers the initial research with the database, through which the primary cutting patterns were identified and the named styles were defined. The chronological span of each named style was established through iterative cross-site seriation and named style comparisons. The consistent use of the primary cutting patterns is interpreted as a consequence of the constraints imposed by the leather and the forms needed to cover the foot. Basic parts of the shoe patterns and the foot are defined, and terms are provided for identifying the key points for pattern making.
Chapter four presents the seventeen primary cutting patterns and their sub-types, which are divided into three main groups: six integral soled patterns, four hybrid soled patterns and seven separately soled patterns. The description of each primary cutting pattern includes its letter code, pattern layout, construction principle, closing seam placement and a list of sub-types. The named shoe styles and their relative chronology are presented in chapter five. Nomenclature for the named styles is based on the find location of the first published example plus the code letter of the primary cutting pattern. The named styles are presented in chronological order from prehistory through to the late 16th century. Short descriptions of the named styles are given and illustrated with examples of recovered archaeological leather footwear, reconstructions of archaeological shoes and iconographical sources. Chapter six presents the documentation of recovered archaeological leather using the Goubitz notation; an inventory and description of the style elements and fastening methods used for defining named shoe styles; technical information about sole/upper constructions; and the consequences of the use of lasts and sewing forms for style identification and for fastening placement in relation to the instep point. The chapter concludes with further technical implications for researchers concerning shoemaking, pattern making and reconstructive archaeology. The conclusion restates the original research question: why does a group of primary cutting patterns appear to have been used consistently throughout the European archaeological record? The quantitative and qualitative results from the database show the use of these patterns, but it is the properties of the leather that impose the use of the primary cutting patterns.
The combined methodology of primary pattern identification, named style and artefact registration provides a framework for calceological research.
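The combined registration scheme described above can be sketched as a simple data model; all codes, names and dates below are hypothetical illustrations, not values from the thesis:

```python
# Illustrative sketch of the combined methodology's registration scheme.
# All codes, names and dates are invented examples, not data from the study.
from dataclasses import dataclass


@dataclass
class PrimaryCuttingPattern:
    code: str        # letter code assigned to the pattern
    sole_group: str  # "integral", "hybrid" or "separate" (the three main groups)


@dataclass
class NamedStyle:
    # Nomenclature: find location of the first published example
    # plus the code letter of the primary cutting pattern.
    name: str
    pattern: PrimaryCuttingPattern
    chronology: tuple[int, int]  # relative date range of the style


# Hypothetical entry:
pattern_a = PrimaryCuttingPattern(code="A", sole_group="integral")
style = NamedStyle(name="Examplesite-A", pattern=pattern_a, chronology=(1200, 1350))
print(f"{style.name}: {style.pattern.sole_group} sole, "
      f"{style.chronology[0]}-{style.chronology[1]}")
```

Separating the pattern (technical constant) from the named style (chronological variable) mirrors the two-level hierarchy the methodology proposes.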

Abiotic factors are considered strong drivers of species distributions and assemblages, yet these spatial patterns are also influenced by biotic interactions. Accounting for competitors or facilitators may improve both the fit and the predictive power of species distribution models (SDMs). We investigated the influence of a dominant species, Empetrum nigrum ssp. hermaphroditum, on the distribution of 34 subordinate species in the tundra of northern Norway. We related the SDM parameters of those subordinate species to their functional traits and to their co-occurrence patterns with E. hermaphroditum across three spatial scales. By combining both approaches, we sought to understand whether these species may be limited by competitive interactions and/or benefit from habitat conditions created by the dominant species. Model fit and predictive power increased for most species when the frequency of occurrence of E. hermaphroditum was included in the SDMs as a predictor. The largest increase was found for species that 1) co-occur with E. hermaphroditum most of the time, both at the large (750 m) and the small (2 m) spatial scale, or co-occur with E. hermaphroditum at the large scale but not at the small scale, and 2) have a particularly low or high leaf dry matter content (LDMC). Species that do not co-occur with E. hermaphroditum at the smallest scale are generally palatable herbaceous species with low LDMC, showing a weak ability to tolerate the resource depletion that is directly or indirectly induced by E. hermaphroditum. Species with high LDMC, which are better able to withstand resource depletion and grazing, are often found in the proximity of E. hermaphroditum. Our results are consistent with previous findings that both competition and facilitation structure plant distributions and assemblages in the Arctic tundra. The functional and co-occurrence approaches used here were complementary and provided a deeper understanding of the observed patterns by refining the pool of potential direct and indirect ecological effects of E. hermaphroditum on the distribution of subordinate species. Our correlative study would benefit from being complemented by experimental approaches.
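The modelling step described above can be illustrated with a minimal sketch (not the authors' code; the synthetic data, predictor names and effect sizes are assumptions): a purely abiotic logistic SDM is compared with one that adds the dominant species' frequency of occurrence as a predictor.

```python
# Sketch: adding a dominant species' occurrence frequency to an SDM predictor set.
# Synthetic data and coefficients are illustrative assumptions only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import log_loss

rng = np.random.default_rng(42)
n = 500
temperature = rng.normal(size=n)           # abiotic predictor
empetrum_freq = rng.uniform(0, 1, size=n)  # frequency of the dominant species

# Simulated subordinate species responding to climate and to the dominant species
logit = 0.8 * temperature + 2.0 * empetrum_freq - 1.0
presence = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X_abiotic = temperature.reshape(-1, 1)
X_full = np.column_stack([temperature, empetrum_freq])

m_abiotic = LogisticRegression().fit(X_abiotic, presence)
m_full = LogisticRegression().fit(X_full, presence)

ll_abiotic = log_loss(presence, m_abiotic.predict_proba(X_abiotic)[:, 1])
ll_full = log_loss(presence, m_full.predict_proba(X_full)[:, 1])
print(f"abiotic-only log-loss: {ll_abiotic:.3f}, with biotic predictor: {ll_full:.3f}")
```

A lower log-loss for the fuller model mirrors the improved fit the study reports when the biotic predictor is included.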

Raman spectroscopy has been applied to characterize fiber dyes and determine the discriminating ability of the method. Black, blue, and red acrylic, cotton, and wool samples were analyzed. Four excitation sources were used to obtain complementary responses in the case of fluorescent samples. Fibers that did not provide informative spectra using a given laser were usually detected using another wavelength. For any colored acrylic, the 633-nm laser did not provide Raman information. The 514-nm laser provided the highest discrimination for blue and black cotton, but half of the blue cottons produced noninformative spectra. The 830-nm laser exhibited the highest discrimination for red cotton. Both visible lasers provided the highest discrimination for black and blue wool, and NIR lasers produced remarkable separation for red and black wool. This study shows that the discriminating ability of Raman spectroscopy depends on the fiber type, color, and the laser wavelength.

The intensity of parasite infections often increases during the reproductive season of the host as a result of parasite reproduction, increased parasite transmission and increased host susceptibility. We report within-individual variation in immune parameters, hematocrit and body mass in adult house martins Delichon urbica rearing nestlings in nests experimentally infested with house martin bugs Oeciacus hirundinis and in birds rearing nestlings in initially parasite-free nests. From first to second broods, the body mass and hematocrit of breeding adult house martins decreased; in contrast, leucocytes and immunoglobulins became more abundant. Adults whose nests were infested with ectoparasites lost more weight than birds raising nestlings in nests treated with pyrethrin, whereas the decrease in hematocrit was more pronounced during infection with blood parasites. Neither experimental infestation with house martin bugs nor blood parasites had a significant effect on the amount of immune defences.

OBJECTIVE: The study tests the hypothesis that a low daily fat intake may induce a negative fat balance and impair catch-up growth in stunted children between 3 and 9 y of age. DESIGN: Randomized case-control study. SETTING: Three rural villages of the West Kiang District, The Gambia. SUBJECTS: Three groups of 30 stunted but not wasted children (height-for-age z-score ≤ -2.0, weight-for-height z-score ≥ -2.0) 3-9 y of age were selected by anthropometric survey. Groups were matched for age, sex, village, degree of stunting and season. INTERVENTION: Two groups were randomly assigned to be supplemented five days a week for one year with either a high-fat (n = 29) or a high-carbohydrate biscuit (n = 30), each containing approximately 1600 kJ. The third group was a non-supplemented control group (n = 29). Growth, nutritional status, dietary intake, resting energy expenditure and morbidity were compared. RESULTS: Neither the high-fat nor the high-carbohydrate supplement had an effect on weight or height gain. The high-fat supplement did slightly increase adipose tissue mass. There was no effect of supplementation on resting energy expenditure or morbidity. In addition, the annual growth rate was not associated with the morbidity score. CONCLUSIONS: The results show that neither a high-fat nor a high-carbohydrate supplement given for 12 months to stunted Gambian children induced catch-up growth. The authors suggest that an adverse effect of the environment on catch-up growth persists despite the nutritional interventions.

Aim: Expression of IL-7R discriminates alloreactive CD4 T cells (Foxp3-negative) from IL-7R-low regulatory CD4 T cells (Foxp3-positive). Chronic hepatitis C virus (HCV) infection reduces the expression of IL-7R on T cells, thus promoting persistence of infection. The aim of this study was to analyze the effect of HCV infection on the expression of IL-7R on activated CD4+ T cells in liver transplant patients. Patients and methods: We analyzed PBMC from liver transplant recipients (24 HCV-negative and 29 HCV-chronically infected) for the expression of CD4, CD25, FoxP3 and IL-7R. We compared these data with non-transplanted individuals (52 HCV-chronically infected patients and 38 healthy donors). Results: In HCV-infected liver transplant recipients, levels of CD4+CD25+CD45RO+IL-7R+ T cells were significantly reduced (10.5±0.9%) compared to non-HCV-infected liver transplant recipients (17.6±1.4%) (P<0.001), while both groups (HCV-infected and HCV-negative transplant recipients) had significantly higher levels than healthy individuals (6.6±0.9%) (P<0.0001). After successful antiviral therapy (sustained antiviral response), 6 HCV-infected transplant recipients showed an increase of CD4+CD25+CD45RO+IL-7R+ T cells, reaching levels similar to those of non-HCV-infected recipients (10.73±2.63% prior to therapy versus 21.7±6.3% after clearance of HCV; P<0.05). In 4 non-responders (i.e. HCV RNA remaining present in serum), levels of CD4+CD25+CD45RO+IL-7R+ T cells remained unmodified during and after antiviral treatment (11.8±3.3% versus 11.3±3.3%, respectively). Conclusions: Overall, these data indicate that CD4+CD25+CD45RO+IL-7R+ T cells appear to be modulated by chronic HCV infection after liver transplantation. Whether the lower levels of alloreactive T cells in HCV-infected liver transplant recipients are associated with a tolerogenic profile remains to be studied.

Ubiquitination of proteins is a post-translational modification that determines the cellular fate of the protein. The addition of ubiquitin moieties to proteins is carried out by the sequential action of three enzymes: E1, the ubiquitin-activating enzyme; E2, the ubiquitin-conjugating enzyme; and E3, the ubiquitin ligase. The TRAF-interacting protein (TRAIP, TRIP, RNF206) functions as a Really Interesting New Gene (RING)-type E3 ubiquitin ligase, but its physiological substrates are not yet known. TRAIP was reported to interact with TRAFs [tumor necrosis factor (TNF) receptor-associated factors] and with the two tumor suppressors CYLD and Syk (spleen tyrosine kinase). Ectopically expressed TRAIP was shown to inhibit nuclear factor-kappa B (NF-κB) signalling. However, recent results suggested a role for TRAIP in biological processes other than NF-κB regulation. Knock-down of TRAIP in human epidermal keratinocytes repressed cellular proliferation and induced a block in the G1/S phase of the cell cycle without affecting NF-κB signalling. TRAIP is necessary for embryonic development: mutations affecting the Drosophila homologue of TRAIP are maternal-effect lethal, and TRAIP knock-out mice die in utero because of aberrant regulation of cell proliferation and apoptosis. These findings underline the tight link between TRAIP and cell proliferation. In this review, we summarize the data on TRAIP and put them into a larger perspective regarding its role in the control of tissue homeostasis.

Antiretroviral therapy has dramatically changed the course of HIV infection, and HIV-infected (HIV(+)) individuals are becoming more frequently eligible for solid-organ transplantation. However, only scarce data are available on how immunosuppressive (IS) strategies relate to transplantation outcome and immune function. We determined the impact of transplantation and immune-depleting treatment on CD4+ T-cell counts; HIV, EBV and cytomegalovirus (CMV) viral loads; and virus-specific T-cell immunity in a 1-year prospective cohort of 27 HIV(+) kidney transplant recipients. While the results show an increasing breadth and magnitude of the herpesvirus-specific cytotoxic T-cell (CTL) response over time, they also revealed a significant depletion of polyfunctional virus-specific CTL in individuals receiving thymoglobulin as a lymphocyte-depleting treatment. The disappearance of polyfunctional CTL was accompanied by virologic EBV reactivation events, directly linking the absence of specific polyfunctional CTL to viral reactivation. The data provide first insights into the immune reserve of HIV(+) transplant recipients and highlight new immunological effects of thymoglobulin treatment. Long-term studies will be needed to assess the clinical risk associated with thymoglobulin treatment, in particular with regard to EBV-associated lymphoproliferative diseases.

P fimbriae are proteinaceous appendages on the surface of Escherichia coli that mediate adherence to uroepithelial cells. E. coli that express P fimbriae account for the majority of ascending urinary tract infections in women with normal urinary tracts. The hypothesis that P fimbriae on uropathogenic E. coli attach to renal epithelia and may regulate the immune response to establish infection was investigated. The polymeric Ig receptor (pIgR), produced by renal epithelia, transports IgA into the urinary space. Kidney pIgR and urine IgA levels were analyzed in a mouse model of ascending pyelonephritis, using E. coli with (P(+)) and without (P(-)) P fimbriae, to determine whether P(+) E. coli regulate epithelial pIgR expression and IgA transport into the urine. P(+) E. coli established infection and persisted to a greater extent than P(-) E. coli. P(+)-infected mice downregulated pIgR mRNA and protein levels compared with P(-)-infected or PBS controls at ≥ 48 h. The decrease in pIgR was associated with decreased urinary IgA levels in the P(+)-infected group at 48 h. pIgR mRNA and protein also declined in P(+) E. coli-infected, LPS-hyporesponsive mice. These studies identify a novel virulence mechanism of E. coli that express P fimbriae. It is proposed that P fimbriae decrease pIgR expression in the kidney and consequently decrease IgA transport into the urinary space. This may explain, in part, how E. coli that bear P fimbriae exploit the immune system of human hosts to establish ascending pyelonephritis.

OBJECTIVE: The study tests the hypothesis that intramodal visual binding is disturbed in schizophrenia and should be detectable in all illness stages as a stable trait marker. METHOD: Three groups of patients (rehospitalized chronic schizophrenic patients, first-admitted schizophrenic patients and schizotypal patients believed to be suffering from a pre-schizophrenic prodrome) and a group of normal control subjects were tested on three tasks targeting visual 'binding' abilities (the Müller-Lyer illusion and two figure-detection tasks), in addition to control parameters such as reaction time, visual selective attention and Raven's test, and on two conventional cortical tasks: spatial working memory (SWM) and a global-local test. RESULTS: Chronic patients showed decreased performance on the binding tests. Unexpectedly, the prodromal group exhibited enhanced Gestalt extraction on these tests compared both to schizophrenic patients and to healthy subjects. Furthermore, chronic schizophrenia was associated with poor performance on the cortical tests of SWM, the global-local test and Raven's test. This association appears to be mediated by, or linked to, the chronicity of the illness. CONCLUSION: The study confirms a variety of neurocognitive deficits in schizophrenia which, however, in this sample seem to be linked to chronicity of illness. Nevertheless, certain aspects of visual processing concerned with Gestalt extraction deserve attention as potential vulnerability or prodrome indicators. The initial hypothesis of the study is rejected.

Chronic hepatitis C virus (HCV) infection remains an important health problem, which is associated with deleterious consequences in kidney transplant recipients. Besides hepatic complications, several extrahepatic complications contribute to reduced patient and allograft survival in HCV-infected kidney recipients. However, HCV infection should not be considered as a contraindication for kidney transplantation because patient survival is better with transplantation than on dialysis. Treatment of HCV infection is currently interferon-alpha (IFN-α) based, which has been associated with higher renal allograft rejection rates. Therefore, antiviral treatment before transplantation is preferable. As in the nontransplant setting, IFN-free treatment regimens, because of their greater efficacy and reduced toxicity, currently represent promising and attractive therapeutic options after kidney transplantation as well. However, clinical trials will be required to closely evaluate these regimens in kidney recipients. There is also a need for prospective controlled studies to determine the optimal immunosuppressive regimens after transplantation in HCV-infected recipients. Combined kidney and liver transplantation is required in patients with advanced liver cirrhosis. However, in patients with cleared HCV infection and early cirrhosis without portal hypertension, kidney transplantation alone may be considered. There is some agreement about the use of HCV-positive donors in HCV-infected recipients, although data regarding posttransplant survival rates are controversial.

BACKGROUND: Increasing the appropriateness of use of upper gastrointestinal (GI) endoscopy is important to improve quality of care while at the same time containing costs. This study explored whether detailed explicit appropriateness criteria significantly improve the diagnostic yield of upper GI endoscopy. METHODS: Consecutive patients referred for upper GI endoscopy at 6 centers (1 university hospital, 2 district hospitals, 3 gastroenterology practices) were prospectively included over a 6-month period. After controlling for disease presentation and patient characteristics, the relationship between the appropriateness of upper GI endoscopy, as assessed by explicit Swiss criteria developed by the RAND/UCLA panel method, and the presence of relevant endoscopic lesions was analyzed. RESULTS: A total of 2088 patients (60% outpatients, 57% men) were included. Analysis was restricted to the 1681 patients referred for diagnostic upper GI endoscopy. Forty-six percent of upper GI endoscopies were judged to be appropriate, 15% uncertain, and 39% inappropriate by the explicit criteria. No cancer was found in upper GI endoscopies judged to be inappropriate. Upper GI endoscopies judged appropriate or uncertain yielded significantly more relevant lesions (60%) than did those judged to be inappropriate (37%; odds ratio 2.6; 95% CI [2.2, 3.2]). In multivariate analyses, the diagnostic yield of upper GI endoscopy was significantly influenced by appropriateness, patient gender and age, treatment setting, and symptoms. CONCLUSIONS: Upper GI endoscopies performed for appropriate indications resulted in detecting significantly more clinically relevant lesions than did those performed for inappropriate indications. In addition, no upper GI endoscopy that resulted in a diagnosis of cancer was judged to be inappropriate. The use of such criteria improves patient selection for upper GI endoscopy and can thus contribute to efforts aimed at enhancing the quality and efficiency of care.
(Gastrointest Endosc 2000;52:333-41).
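The reported odds ratio can be recovered from the two diagnostic yields quoted in the abstract; a minimal arithmetic check:

```python
# Recomputing the reported odds ratio from the two diagnostic yields.
def odds(p: float) -> float:
    """Convert a probability to odds."""
    return p / (1.0 - p)

p_appropriate = 0.60    # yield of relevant lesions, appropriate/uncertain indications
p_inappropriate = 0.37  # yield of relevant lesions, inappropriate indications

odds_ratio = odds(p_appropriate) / odds(p_inappropriate)
print(round(odds_ratio, 1))  # → 2.6, matching the value reported in the abstract
```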