839 results for Class-based isolation vs. sharing
Abstract:
The detection of Leishmania spp. in skin lesion aspirates, using a puncture technique, was evaluated in 76 patients with cutaneous leishmaniasis (CL) who were referred to a Leishmaniasis Reference Centre in Brazil. CL was defined based on skin lesions suggestive of the disease and on a positive result of the Montenegro skin test or of Giemsa-stained imprints of biopsy fragments. The aspirates were cultured using a vacuum tube device containing culture medium and evaluated for the presence of Leishmania spp. The biphasic medium culture was examined once a week for three weeks. Promastigotes were observed in 53/76 (69.7%) cultures. Stained smears from 60 of the 76 patients were evaluated using PCR-RFLP to detect the conserved minicircle region of Leishmania spp. and to classify the parasite. Of these patients, 45 (75%) showed positive results in aspirate culture and 15 presented negative results. The PCR was positive in 80% (53/60) of the samples. The PCR-RFLP profile was determined in 49 samples, of which 45 (92%) showed a pattern compatible with Leishmania (Viannia) braziliensis. Aspirate culture is a sensitive and feasible method for diagnosing CL and may be routinely adopted by health services for L. (V.) braziliensis isolation and identification.
Abstract:
Objectives: The study objective was to derive reference pharmacokinetic (PK) curves of antiretroviral drugs (ART) based on available population pharmacokinetic (Pop-PK) studies, which can be used to optimize therapeutic drug monitoring (TDM)-guided dosage adjustment. Methods: A systematic search of Pop-PK studies of 8 ART in adults was performed in PubMed. To simulate reference PK curves, a summary of the PK parameters was obtained for each drug using a meta-analysis approach. Most models were one-compartment models, which was therefore chosen as the reference structure. Models with bi-exponential disposition were simplified to one compartment, since the first distribution phase was rapid and not determinant for the description of the terminal elimination phase, which was most relevant for this project. Different absorption models were standardized to first-order absorption processes. Apparent clearance (CL), apparent volume of distribution of the terminal phase (Vz) and the absorption rate constant (ka), together with their inter-individual variability, were pooled into summary mean values weighted by the number of plasma levels; intra-individual variability was weighted by the number of individuals in each study. Simulations based on the summary PK parameters served to construct concentration percentile curves (NONMEM®). Concordance between individual and summary parameters was assessed graphically using forest plots. To test robustness, the difference between simulated curves based on published and summary parameters was calculated using efavirenz as a probe drug. Results: CL was readily accessible from all studies. For one-compartment studies, Vz was the central volume of distribution; for two-compartment studies, Vz was CL/λz. ka was used directly or, for more complicated absorption models, derived from the mean absorption time (MAT), assuming MAT = 1/ka. The value of CL for each drug was in excellent agreement across all Pop-PK models, suggesting that the minimal concentration derived from the summary models was adequately characterized. The comparison of the concentration vs. time profiles for efavirenz between published and summary PK parameters revealed no more than a 20% difference. Although our approach appears adequate for estimating the elimination phase, the simplification of the absorption phase might lead to a small bias shortly after drug intake. Conclusions: Simulated reference percentile curves based on such an approach represent a useful tool for interpreting drug concentrations. This Pop-PK meta-analysis approach should be further validated and could be extended into a more sophisticated computerized tool for the Bayesian TDM of ART.
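The simulation step described above can be illustrated with a short sketch: a one-compartment model with first-order absorption, Monte Carlo sampling of lognormal inter-individual variability, and construction of reference percentile curves. The typical values and variabilities used here (dose, CL/F, V/F, ka) are illustrative placeholders, not the summary estimates pooled in the study.

```python
import numpy as np

# One-compartment model with first-order absorption:
# C(t) = Dose*ka / (V*(ka - ke)) * (exp(-ke*t) - exp(-ka*t)), with ke = CL/V
def conc(t, dose, cl, v, ka):
    ke = cl / v
    return dose * ka / (v * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

rng = np.random.default_rng(42)
n_subjects = 1000
dose = 600.0                      # mg, single dose (illustrative)
t = np.linspace(0.0, 24.0, 97)    # one dosing interval, 15-min grid

# Illustrative typical parameters with lognormal inter-individual variability
# -- placeholders, NOT the values derived in the study.
cl = 10.0 * np.exp(rng.normal(0.0, 0.30, n_subjects))   # L/h, ~30% CV
v  = 250.0 * np.exp(rng.normal(0.0, 0.25, n_subjects))  # L,   ~25% CV
ka = 0.6  * np.exp(rng.normal(0.0, 0.40, n_subjects))   # 1/h, ~40% CV

profiles = conc(t[None, :], dose, cl[:, None], v[:, None], ka[:, None])

# Reference percentile curves analogous to those simulated with NONMEM
p5, p50, p95 = np.percentile(profiles, [5, 50, 95], axis=0)
print(f"median C24h = {p50[-1]:.2f} mg/L (5th-95th: {p5[-1]:.2f}-{p95[-1]:.2f})")
```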
Abstract:
The generation of an antigen-specific T-lymphocyte response is a complex multi-step process. Upon T-cell receptor-mediated recognition of antigen presented by activated dendritic cells, naive T-lymphocytes enter a program of proliferation and differentiation, during the course of which they acquire effector functions and may ultimately become memory T-cells. A major goal of modern immunology is to precisely identify and characterize effector and memory T-cell subpopulations that may be most efficient in disease protection. Sensitive methods are required to address these questions in exceedingly low numbers of antigen-specific lymphocytes recovered from clinical samples, and not manipulated in vitro. We have developed new techniques to dissect immune responses against viral or tumor antigens. These allow the isolation of various subsets of antigen-specific T-cells (with major histocompatibility complex [MHC]-peptide multimers and five-color FACS sorting) and the monitoring of gene expression in individual cells (by five-cell reverse transcription-polymerase chain reaction [RT-PCR]). We can also follow their proliferative life history by flow-fluorescence in situ hybridization (FISH) analysis of average telomere length. Recently, using these tools, we have identified subpopulations of CD8+ T-lymphocytes with distinct proliferative history and partial effector-like properties. Our data suggest that these subsets descend from recently activated T-cells and are committed to become differentiated effector T-lymphocytes.
Abstract:
The effects of artemisinin-based combination therapies (ACTs) on transmission of Plasmodium falciparum were evaluated after a policy change instituting the use of ACTs in an endemic area. P. falciparum gametocyte carriage, sex ratios and inbreeding rates were examined in 2,585 children at presentation with acute falciparum malaria during a 10-year period from 2001-2010. Asexual parasite rates were also evaluated from 2003-2010 in 10,615 children before and after the policy change. Gametocyte carriage declined significantly from 12.4% in 2001 to 3.6% in 2010 (χ² for trend = 44.3, p < 0.0001), but sex ratios and inbreeding rates remained unchanged. Additionally, overall parasite rates remained unchanged before and after the policy change (47.2% vs. 45.4%), but these rates declined significantly from 2003-2010 (χ² for trend = 35.4, p < 0.0001). Chloroquine (CQ) and artemether-lumefantrine (AL) were used as prototype drugs before and after the policy change, respectively. AL significantly shortened the duration of male gametocyte carriage in individual patients after treatment began compared with CQ (log rank statistic = 7.92, p = 0.005). ACTs reduced the rate of gametocyte carriage in children with acute falciparum infections at presentation and shortened the duration of male gametocyte carriage after treatment. However, parasite population sex ratios, inbreeding rates and overall parasite rate were unaffected.
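The χ² for trend reported above is the standard Cochran-Armitage statistic for a proportion measured across ordered groups (here, calendar years). The sketch below implements that formula; the per-year counts are hypothetical placeholders for illustration only, not the study data.

```python
import numpy as np
from scipy.stats import chi2

def chi2_for_trend(events, totals, scores=None):
    """Cochran-Armitage chi-square test for a linear trend in proportions."""
    r = np.asarray(events, dtype=float)
    n = np.asarray(totals, dtype=float)
    t = np.arange(len(n), dtype=float) if scores is None else np.asarray(scores, float)
    N, R = n.sum(), r.sum()
    p = R / N
    num = np.sum(t * (r - n * p)) ** 2
    den = p * (1 - p) * (np.sum(n * t**2) - np.sum(n * t) ** 2 / N)
    stat = num / den                       # 1 degree of freedom
    return stat, chi2.sf(stat, df=1)

# Hypothetical yearly gametocyte-carriage counts (2001..2010), illustration only
carriers = [31, 28, 27, 24, 20, 17, 14, 12, 10, 9]
totals   = [250, 255, 260, 258, 262, 259, 257, 261, 260, 263]
stat, p = chi2_for_trend(carriers, totals)
print(f"chi-square for trend = {stat:.1f}, p = {p:.2g}")
```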
Abstract:
Coxiella burnetii is the agent of Q fever, an emerging worldwide zoonosis with a wide clinical spectrum. Although C. burnetii infection is typically associated with acute infection, atypical pneumonia and flu-like symptoms, endocarditis, osteoarticular manifestations and severe disease are possible, especially when the patient has a suppressed immune system; however, these severe complications are typically neglected. This study reports the sequencing of the repetitive element IS1111 of the transposase gene of C. burnetii from blood and bronchoalveolar lavage (BAL) samples from a patient with severe pneumonia following methotrexate therapy, resulting in the molecular diagnosis of Q fever in a patient who had been diagnosed with active seronegative polyarthritis two years earlier. To the best of our knowledge, this represents the first documented case of the isolation of C. burnetii DNA from a BAL sample.
Abstract:
Epidemiological studies have demonstrated that the variability of the clinical response to infection caused by Mycobacterium leprae is associated with host genetic factors. The present study investigated the frequency of human leukocyte antigen (HLA) class II (DRB1) alleles in patients with leprosy from São Luís, Maranhão, Brazil. A case-control study was performed in 85 individuals with leprosy and 85 healthy subjects. All samples were analysed via polymerase chain reaction-sequence specific oligonucleotide probes. The HLA-DRB1*16 allele showed a higher frequency in the group with leprosy [9.41% vs. 4.12%; odds ratio (OR) = 2.41; 95% confidence interval (CI) = 0.96-6.08; p = 0.05], whereas the HLA-DRB1*11 allele was less frequent in the group with leprosy [6.47% vs. 11.76%; OR = 0.51; 95% CI = 0.23-1.12; p = 0.09]. Comparison of HLA-DRB1* allele frequencies between the control group and leprosy patient subgroups presenting different forms of the disease showed that the HLA-DRB1*16 (16.13% vs. 8.24%, OR = 4.10, CI = 1.27-13.27, p = 0.010) and HLA-DRB1*14 (5% vs. 3.53%, OR = 4.63, CI = 1.00-21.08, p = 0.032) alleles were significantly more frequent in patients with particular clinical subtypes of leprosy. The sample size was a limitation of this study. Nevertheless, the results demonstrate the existence of a genetic susceptibility associated with the clinical forms of leprosy. The low frequency of the HLA-DRB1*11 allele should be studied further to investigate the possible protective effect of this allele.
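The odds ratios and 95% confidence intervals above can be reproduced from a 2x2 allele-count table with Woolf's logit method. In the sketch below, the HLA-DRB1*16 counts are inferred from the reported allele frequencies (9.41% and 4.12% of 2 × 85 = 170 alleles per group) and are therefore an assumption; the output comes out close to the reported OR = 2.41 (95% CI 0.96-6.08).

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with Woolf's logit 95% CI for a 2x2 table:
                allele copies   other alleles
    cases            a               b
    controls         c               d
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, +1))
    return or_, lo, hi

# HLA-DRB1*16: counts inferred from 9.41% vs. 4.12% of 170 alleles per group
a, b = 16, 170 - 16   # leprosy patients: copies of the allele / other alleles
c, d = 7, 170 - 7     # healthy controls
or_, lo, hi = odds_ratio_ci(a, b, c, d)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```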
Abstract:
Cancer is a leading cause of morbidity and mortality in Western countries (as an example, colorectal cancer accounts for about 300,000 new cases and 200,000 deaths each year in Europe and in the USA). Although many patients with cancer have complete macroscopic clearance of their disease after resection, radiotherapy and/or chemotherapy, many of them develop fatal recurrence. Vaccination with immunogenic peptide tumor antigens has shown encouraging progress in the last decade; immunotherapy might therefore constitute a fourth therapeutic option in the future. We dissect here and critically evaluate the numerous steps of reverse immunology, a predictive procedure to identify antigenic peptides from the sequence of a gene of interest. Bioinformatic algorithms were applied to mine sequence databases for tumor-specific transcripts. A quality assessment of publicly available sequence databanks allowed us to define the strengths and weaknesses of bioinformatics-based prediction of colon cancer-specific alternative splicing: new splice variants could be identified, but cancer-restricted expression could not be significantly predicted. Other sources of target transcripts, such as cancer-testis genes or reportedly overexpressed transcripts, were quantitatively investigated by polymerase chain reaction. Based on the relative expression of a defined set of housekeeping genes in colon cancer tissues, we characterized a precise procedure for accurate normalization and determined a threshold for the definition of significant overexpression of genes in cancers versus normal tissues. Further steps of reverse immunology were applied to a splice variant of the Melan-A gene. Since it is known that the C-termini of antigenic peptides are directly produced by the proteasome, longer precursor and overlapping peptides encoded by the target sequence were synthesized chemically and digested in vitro with purified proteasome. The resulting fragments were identified by mass spectrometry to detect cleavage sites. Using this information and the available anchor motifs for defined HLA class I molecules, putative antigenic peptides could be predicted. Their relative affinity for HLA molecules was confirmed experimentally with functional competitive binding assays, and they were used to search patients' peripheral blood lymphocytes for the presence of specific cytolytic T lymphocytes (CTL). CTL clones specific for a splice variant of Melan-A could be isolated; although they recognized peptide-pulsed cells, they failed to lyse melanoma cells in functional assays of antigen recognition. In the conclusion, we discuss the advantages and bottlenecks of reverse immunology and compare the technical aspects of this approach with the more classical procedure of direct immunology, a technique introduced by Boon and colleagues more than 10 years ago to successfully clone tumor antigens.
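The anchor-motif-based prediction step mentioned above can be sketched as a simple scan of a protein sequence for 9-mers carrying canonical HLA-A*02:01 anchor residues (position 2: L/M; position 9: V/L/I). Both the toy sequence and this crude motif rule are illustrative simplifications, not the actual prediction procedure or a real Melan-A splice variant.

```python
# Scan a protein sequence for 9-mer peptides matching a simplified
# HLA-A*02:01 anchor motif (P2 in {L, M}, P9 in {V, L, I}).
P2_ANCHORS = set("LM")
P9_ANCHORS = set("VLI")

def candidate_epitopes(sequence, length=9):
    sequence = sequence.upper()
    hits = []
    for i in range(len(sequence) - length + 1):
        pep = sequence[i:i + length]
        if pep[1] in P2_ANCHORS and pep[-1] in P9_ANCHORS:
            hits.append((i + 1, pep))        # 1-based start position
    return hits

# Toy sequence, made up for illustration only
toy = "MKTDAGLMAFWQKVRESNPLITQHW"
for pos, pep in candidate_epitopes(toy):
    print(f"{pos:3d}  {pep}")
```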
Abstract:
BACKGROUND Functional brain images such as Single-Photon Emission Computed Tomography (SPECT) and Positron Emission Tomography (PET) have been widely used to guide clinicians in the Alzheimer's Disease (AD) diagnosis. However, the subjectivity involved in their evaluation has favoured the development of Computer Aided Diagnosis (CAD) systems. METHODS A novel combination of feature extraction techniques is proposed to improve the diagnosis of AD. Firstly, Regions of Interest (ROIs) are selected by means of a t-test carried out on 3D Normalised Mean Square Error (NMSE) features restricted to be located within a predefined brain activation mask. In order to address the small sample-size problem, the dimension of the feature space was further reduced by: Large Margin Nearest Neighbours using a rectangular matrix (LMNN-RECT), Principal Component Analysis (PCA) or Partial Least Squares (PLS) (the latter two also analysed with an LMNN transformation). Regarding the classifiers, kernel Support Vector Machines (SVMs) and LMNN using Euclidean, Mahalanobis and Energy-based metrics were compared. RESULTS Several experiments were conducted in order to evaluate the proposed LMNN-based feature extraction algorithms and their benefits as: i) a linear transformation of the PLS- or PCA-reduced data, ii) a feature reduction technique, and iii) a classifier (with Euclidean, Mahalanobis or Energy-based methodology). The system was evaluated by means of k-fold cross-validation, yielding accuracy, sensitivity and specificity values of 92.78%, 91.07% and 95.12% (for SPECT) and 90.67%, 88% and 93.33% (for PET), respectively, when the NMSE-PLS-LMNN feature extraction method was used in combination with an SVM classifier, thus outperforming recently reported baseline methods. CONCLUSIONS All the proposed methods turned out to be valid solutions for the presented problem. One advance is the robustness of the LMNN algorithm, which not only provides a higher separation rate between the classes but also, in combination with NMSE and PLS, makes this rate more stable. Another advance is the generalization ability of the approach, since the experiments were performed on two image modalities (SPECT and PET).
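A minimal sketch of the feature-reduction-plus-classifier stage described above: PCA followed by a kernel SVM, evaluated with stratified k-fold cross-validation (PLS, or LMNN from the separate `metric-learn` package, could be swapped in for the reduction step). The NMSE/ROI features and labels below are random placeholders standing in for the real SPECT/PET data, so this is an illustration of the pipeline shape, not the published system.

```python
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)

# Placeholder data: 90 subjects x 500 masked NMSE features, binary AD label
X = rng.normal(size=(90, 500))
y = rng.integers(0, 2, size=90)

# Feature reduction (PCA here; PLS or LMNN are drop-in alternatives) + kernel SVM
clf = Pipeline([
    ("scale", StandardScaler()),
    ("reduce", PCA(n_components=20)),
    ("svm", SVC(kernel="rbf", C=1.0, gamma="scale")),
])

cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
acc = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")
print(f"10-fold accuracy: {acc.mean():.3f} +/- {acc.std():.3f}")
```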
Abstract:
AIM: The aim of this study was to evaluate a new pedagogical approach to teaching fluid, electrolyte and acid-base pathophysiology to undergraduate students. METHODS: This approach comprises traditional lectures, the study of clinical cases on the web and a final interactive discussion of these cases in the classroom. On the web, the students are asked to select the laboratory tests that seem most appropriate for understanding the pathophysiological condition underlying the clinical case. The percentage of students having chosen a given test is made available to the teacher, who uses it in an interactive session to stimulate discussion with the whole class. The same teacher used the same case studies in the third year of the curriculum during two consecutive years. RESULTS: The majority of students answered the questions on the web as requested and evaluated their experience with this form of teaching and learning positively. CONCLUSIONS: Complementing traditional lectures with online case-based studies and interactive group discussions therefore represents a simple means to promote the learning and understanding of complex pathophysiological mechanisms. This simple problem-based approach to teaching and learning may be implemented to cover all fields of medicine.
Abstract:
This paper presents the evaluation of a Standards-based Interoperability Framework (SIF) deployed between the Virgen del Rocío University Hospital (VRUH) Haemodialysis (HD) Unit and 5 outsourced HD centres in order to improve integrated care by automatically sharing patients' Electronic Health Records (EHR) and lab test reports. A pre-post study was conducted over fourteen months. The number of lab test reports, of both emergency and routine nature, relating to 379 outpatients was computed before and after the integration of the SIF. Before the SIF was introduced, 19.38 lab tests per patient were shared between VRUH and the HD centres, of which 5.52 were of an emergency nature and 13.85 were routine. After integrating the SIF, 17.98 lab tests per patient were shared, of which 3.82 were of an emergency nature and 14.16 were routine. The inclusion of the SIF in the HD Integrated Care Process led to an average reduction of 1.39 (p = 0.775) lab test requests per patient, including a reduction of 1.70 (p = 0.084) in those of an emergency nature, whereas an increase of 0.31 (p = 0.062) was observed in routine lab tests. Adopting this strategy led to a reduction in emergency lab test requests, which implies a potential improvement in integrated care.
Abstract:
In a competitive world, the way a firm establishes its organizational arrangements may determine the enhancement of its core competences and the possibility of reaching new markets. Firms that find their skills to be applicable in just one type of market encounter constraints in expanding their markets, and through alliances they may find a competitive form of value capture. Hybrid forms of organization appear primarily as an alternative for capturing value and managing joint assets when the market and hierarchy modes do not yield gains for the firm's competitiveness. As a result, this form may present other challenging issues, such as the allocation of rights and principal-agent problems. The biofuel market has shown a strong pattern of change over the last 10 years. New intra-firm arrangements have appeared as a path to participate or survive amid global competition. Given the need for capital to achieve better results, there has been a consistent movement of mergers and acquisitions in the biofuel sector, especially since the 2008 financial crisis. In 2011 there were five major groups in Brazil with a grinding capacity of more than 15 million tons per year: Raízen (a joint venture formed by Cosan and Shell), Louis Dreyfus, Tereos Petrobras, ETH, and Bunge. Major oil companies have implemented a strategy of diversification as a hedge against the rising cost of oil. Using the alliance of Cosan and Shell in the Brazilian biofuel market as a case study, this paper analyses the governance mode and the challenging issues raised by strategic alliances when firms aim to reach new markets by sharing core competences with local firms. The article is based on documentary research and interviews with Cosan's Investor Relations staff, and examines the main questions involving hybrid forms through the lenses of Transaction Cost Economics (TCE), Agency Theory, the Resource Based View (RBV), and dynamic capabilities. One focal point is knowledge "appropriability" and the specific assets originated by the joint venture. Once the alliance is formed, it is expected that competences will be shared and new capabilities will expand the limits of the firm. In the case studied, Cosan and Shell shared a number of strategic assets related to their competences. Raízen was formed with economizing incentives, as well as to continue marshalling internal resources to enhance the company's presence in the world energy sector. Therefore, some challenges relate to controlling and monitoring the agents' behavior, considering the two-part organism formed by distinctive organizational cultures, tacit knowledge, and long-term incentives. The case study analyzed illustrates the hybrid arrangement as a middle form for organizing the transaction: neither the market nor the hierarchy mode, but rather a more flexible commitment agreement with a strategic central authority. Corporate governance devices are also a challenge, since alignment between the parent companies in a joint venture is far more complex. These characteristics have led to an organism with bilateral dependence, offering favorable conditions for developing dynamic capabilities. However, these conditions might rely on the partners' long-term interest in the joint venture.
Abstract:
We study the effect of strong heterogeneities on the fracture of disordered materials using a fiber bundle model. The bundle is composed of two subsets of fibers: a fraction 0 ≤ α ≤ 1 of fibers is unbreakable, while the remaining 1 − α fraction is characterized by a distribution of breaking thresholds. Assuming global load sharing, we show analytically that there exists a critical fraction of the components, αc, which separates two qualitatively different regimes of the system: below αc the burst size distribution is a power law with the usual exponent τ = 5/2, while above αc the exponent switches to a lower value τ = 9/4 and a cutoff appears with a diverging characteristic size. Analyzing the macroscopic response of the system, we demonstrate that the transition is restricted to disorder distributions for which the constitutive curve has a single maximum and an inflexion point, defining a novel universality class of breakdown phenomena.
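The burst statistics described above can be explored numerically with a short global-load-sharing simulation: a fraction α of fibers is given an infinite threshold, the rest draw thresholds from a chosen distribution (uniform here, for simplicity; the analytical results concern distributions whose constitutive curve has a single maximum and an inflexion point), and avalanche sizes are recorded under quasi-static force increase. This is an illustrative sketch under those assumptions, not the authors' code.

```python
import numpy as np

def fbm_burst_sizes(n_fibers=200_000, alpha=0.3, rng=None):
    """Quasi-static, global-load-sharing fiber bundle with a fraction
    alpha of unbreakable fibers; returns the observed avalanche sizes."""
    if rng is None:
        rng = np.random.default_rng(1)
    n_breakable = n_fibers - int(alpha * n_fibers)
    thresholds = np.sort(rng.uniform(size=n_breakable))   # uniform disorder

    bursts, force, intact, size = [], 0.0, n_fibers, 0
    for th in thresholds:            # breakable fibers fail in threshold order
        needed = th * intact         # external force putting this fiber at its threshold
        if needed > force:           # load must be raised: the current burst ends
            if size:
                bursts.append(size)
            force, size = needed, 0
        intact -= 1                  # the fiber breaks; survivors share its load
        size += 1
    if size:                         # final (possibly catastrophic) avalanche
        bursts.append(size)
    return np.asarray(bursts)

# A log-log histogram of `sizes` exhibits the power-law burst regimes discussed above.
sizes = fbm_burst_sizes(alpha=0.3)
print(f"{len(sizes)} bursts recorded; largest avalanche: {sizes.max()} fibers")
```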
Abstract:
Only half of hypertensive patients have controlled blood pressure. Chronic kidney disease (CKD) is also associated with poor blood pressure control, with only 25-30% of CKD patients achieving adequate blood pressure. The Community Preventive Services Task Force has recently recommended team-based care to improve blood pressure control. Team-based care of hypertension involves facilitating the coordination of care among physician, pharmacist and nurse, and requires sharing clinical data, laboratory results and medications, e.g., electronically or by fax. Based on recent studies, the development and evaluation of team-based care for hypertensive patients should be undertaken in the Swiss healthcare system.
Abstract:
OBJECTIVES: Laboratory detection of vancomycin-intermediate Staphylococcus aureus (VISA) and their heterogeneous VISA (hVISA) precursors is difficult. Thus, it is possible that vancomycin failures against supposedly vancomycin-susceptible S. aureus are due to undiagnosed VISA or hVISA. We tested this hypothesis in experimental endocarditis. METHODS: Rats with aortic valve infection due to the vancomycin-susceptible (MIC 2 mg/L), methicillin-resistant S. aureus M1V2 were treated for 2 days with doses of vancomycin that mimicked the pharmacokinetics seen in humans following intravenous administration of 1 g of the drug every 12 h. Half of the treated animals were killed 8 h after treatment arrest and half 3 days thereafter. Population analyses were done directly on vegetation homogenates or after one subculture in drug-free medium to mimic standard diagnostic procedures. RESULTS: Vancomycin cured 14 of 26 animals (54%; P<0.05 versus controls) after 2 days of treatment. When vegetation homogenates were plated directly on vancomycin-containing plates, 6 of 13 rats killed 8 h after treatment arrest had positive cultures, 1 of which harboured hVISA. Likewise, 6 of 13 rats killed 3 days thereafter had positive valve cultures, 5 of which harboured hVISA. However, one subculture of vegetations in drug-free broth was enough to revert all the hVISA phenotypes to the susceptible pattern of the parent. Thus, vancomycin selected for hVISA during therapy of experimental endocarditis due to vancomycin-susceptible S. aureus. These hVISA were associated with vancomycin failure. The hVISA phenotype persisted in vivo, even after vancomycin arrest, but was missed in vitro after a single passage of the vegetation homogenate on drug-free medium. CONCLUSIONS: hVISA might escape detection in clinical samples if they are subcultured before susceptibility tests.
Abstract:
This thesis proposes a set of adaptive broadcast solutions and an adaptive data replication solution to support the deployment of P2P applications. P2P applications are an emerging type of distributed application running on top of P2P networks. Typical P2P applications are video streaming, file sharing, etc. While interesting because they are fully distributed, P2P applications suffer from several deployment problems due to the nature of the environment in which they run. Indeed, defining an application on top of a P2P network often means defining an application where peers contribute resources in exchange for their ability to use the P2P application. For example, in a P2P file sharing application, while the user is downloading some file, the P2P application is in parallel serving that file to other users. Such peers could have limited hardware resources, e.g., CPU, bandwidth and memory, or the end-user could decide a priori to limit the resources dedicated to the P2P application. In addition, a P2P network is typically immersed in an unreliable environment, where communication links and processes are subject to message losses and crashes, respectively. To support P2P applications, this thesis proposes a set of services that address some underlying constraints related to the nature of P2P networks. The proposed services include a set of adaptive broadcast solutions and an adaptive data replication solution that can be used as the basis of several P2P applications. Our data replication solution makes it possible to increase availability and to reduce the communication overhead. The broadcast solutions aim at providing a communication substrate encapsulating one of the key communication paradigms used by P2P applications: broadcast. Our broadcast solutions typically aim at offering reliability and scalability to some upper layer, be it an end-to-end P2P application or another system-level layer, such as a data replication layer. Our contributions are organized in a protocol stack made of three layers. In each layer, we propose a set of adaptive protocols that address specific constraints imposed by the environment. Each protocol is evaluated through a set of simulations. The adaptiveness of our solutions relies on the fact that they take the constraints of the underlying system into account in a proactive manner. To model these constraints, we define an environment approximation algorithm that allows us to obtain an approximated view of the system or of part of it. This approximated view includes the topology and the reliability of the components, expressed in probabilistic terms. To adapt to the underlying system constraints, the proposed broadcast solutions route messages through tree overlays so as to maximize broadcast reliability. Here, broadcast reliability is expressed as a function of the reliability of the selected paths and of the use of available resources. These resources are modeled in terms of quotas of messages reflecting the receiving and sending capacities of each node. To allow deployment in a large-scale system, we take into account the available memory at each process by limiting the view it has to maintain about the system. Using this partial view, we propose three scalable broadcast algorithms, which are based on a propagation overlay that tends toward the global tree overlay and adapts to some constraints of the underlying system.
At a higher level, this thesis also proposes a data replication solution that is adaptive both in terms of replica placement and in terms of request routing. At the routing level, this solution takes the unreliability of the environment into account in order to maximize reliable delivery of requests. At the replica placement level, the dynamically changing origin and frequency of read/write requests are analyzed in order to define a set of replicas that minimizes the communication cost.
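As a concrete illustration of expressing broadcast reliability as a function of the reliability of the selected paths, the sketch below computes, for a given tree overlay with per-link delivery probabilities, each node's probability of receiving a broadcast from the root (the product of the link reliabilities along its path) and an expected-coverage score for the whole tree. The node names, probabilities and independence assumption are made up for the example; they are not taken from the thesis.

```python
from typing import Dict, Tuple

# Tree overlay: child -> (parent, probability that the parent->child link delivers)
TREE: Dict[str, Tuple[str, float]] = {
    "b": ("a", 0.99),
    "c": ("a", 0.95),
    "d": ("b", 0.90),
    "e": ("b", 0.97),
    "f": ("c", 0.85),
}

def delivery_probability(node: str, root: str = "a") -> float:
    """Probability that `node` receives a broadcast sent by `root`,
    assuming independent link failures along the unique tree path."""
    p = 1.0
    while node != root:
        parent, link_p = TREE[node]
        p *= link_p
        node = parent
    return p

# Per-node reachability and expected coverage of the broadcast
probs = {n: delivery_probability(n) for n in TREE}
expected_coverage = 1 + sum(probs.values())   # the root always holds the message
for n, p in sorted(probs.items()):
    print(f"node {n}: P(delivery) = {p:.3f}")
print(f"expected number of nodes reached: {expected_coverage:.2f} of {len(TREE) + 1}")
```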