Abstract:
Reverse transcriptase (RT) is a multifunctional enzyme in the human immunodeficiency virus (HIV)-1 life cycle and represents a primary target for drug discovery efforts against HIV-1 infection. Two classes of RT inhibitors, the nucleoside RT inhibitors (NRTIs) and the nonnucleoside RT inhibitors (NNRTIs), are prominently used in highly active antiretroviral therapy in combination with other anti-HIV drugs. However, the rapid emergence of drug-resistant viral strains has limited the success rate of anti-HIV agents. Computational methods are a significant part of the drug design process and indispensable for studying drug resistance. In this review, recent advances in computer-aided drug design for the rational design of new compounds against HIV-1 RT are discussed, covering methods such as molecular docking, molecular dynamics, free energy calculations, quantitative structure-activity relationships, pharmacophore modelling, and absorption, distribution, metabolism, excretion and toxicity (ADMET) prediction. Successful applications of these methodologies are also highlighted.
Abstract:
Background: Although combination antiretroviral therapy (cART) dramatically reduces rates of AIDS and death, a minority of patients experience clinical disease progression during treatment. Objective: To investigate whether detection of CXCR4 (X4)-specific strains or quantification of X4-specific HIV-1 load predicts clinical outcome. Methods: From the Swiss HIV Cohort Study, 96 participants who initiated cART yet subsequently progressed to AIDS or death were compared with 84 contemporaneous, treated nonprogressors. A sensitive heteroduplex tracking assay was developed to quantify plasma X4 and CCR5 variants and resolve HIV-1 load into coreceptor-specific components. Measurements were analyzed as cofactors of progression in multivariable Cox models adjusted for concurrent CD4 cell count and total viral load, applying inverse probability weights to adjust for sampling bias. Results: Patients with X4 variants at baseline displayed reduced CD4 cell responses compared with those without X4 strains (40 versus 82 cells/μl; P = 0.012). The adjusted multivariable hazard ratio (HR) for clinical progression was 4.8 [95% confidence interval (CI) 2.3-10.0] for those demonstrating X4 strains at baseline. The X4-specific HIV-1 load was a similarly independent predictor, with HR values of 3.7 (95% CI 1.2-11.3) and 5.9 (95% CI 2.2-15.0) for baseline loads of 2.2-4.3 and > 4.3 log10 copies/ml, respectively, compared with < 2.2 log10 copies/ml. Conclusions: HIV-1 coreceptor usage and X4-specific viral loads strongly predicted disease progression during cART, independent of and in addition to CD4 cell count and total viral load. Detection and quantification of X4 strains promise to be clinically useful biomarkers to guide patient management and study HIV-1 pathogenesis.
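For readers unfamiliar with this class of analysis, the sketch below shows how such an adjusted hazard ratio can be estimated in principle. It is a minimal illustration on simulated data with invented variable names and effect sizes, using the open-source lifelines library rather than the study's actual software or cohort; the study's inverse probability weighting could be added via the fitter's weights_col argument.

```python
# Minimal sketch: multivariable Cox model in which baseline X4 detection
# predicts progression, adjusted for CD4 count and total viral load.
# All data below are simulated; nothing here reproduces the study.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
x4 = rng.integers(0, 2, n)                        # baseline X4 strain detected
cd4 = rng.normal(350, 120, n) - 80 * x4           # cells/ul, lower when X4 present
vl = rng.normal(3.5, 0.8, n) + 0.5 * x4           # log10 copies/ml
hazard = 0.02 * np.exp(1.2 * x4 - 0.004 * (cd4 - 350) + 0.3 * (vl - 3.5))
time = rng.exponential(1.0 / hazard)              # months to progression
censor = rng.exponential(40.0, n)                 # administrative censoring

df = pd.DataFrame({
    "duration": np.minimum(time, censor),
    "progressed": (time <= censor).astype(int),
    "x4_detected": x4,
    "cd4_count": cd4,
    "log10_viral_load": vl,
})

cph = CoxPHFitter()
cph.fit(df, duration_col="duration", event_col="progressed")
cph.print_summary()  # exp(coef) of x4_detected is the adjusted hazard ratio
```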
Abstract:
During the Early Toarcian, major paleoenvironmental and paleoceanographical changes occurred, leading to an oceanic anoxic event (OAE) and to a perturbation of the carbon isotope cycle. Although the standard biochronology of the Lower Jurassic is essentially based upon ammonites, in recent years biostratigraphy based on calcareous nannofossils and dinoflagellate cysts has been increasingly used to date Jurassic rocks. However, the precise dating and correlation of the Early Toarcian OAE, and of the associated δ13C anomaly in different settings of the western Tethys, are still partly problematic, and it is still unclear whether these events are synchronous or not. In order to allow more accurate correlations of the organic-rich levels recorded during the Early Toarcian OAE, this account proposes a new biozonation based on a quantitative biochronology approach, the Unitary Associations (UA), applied to calcareous nannofossils. This study represents the first attempt to apply the UA method to Jurassic nannofossils. The study incorporates eighteen sections distributed across the western Tethys and ranging from the Pliensbachian to the Aalenian, comprising 1220 samples and 72 calcareous nannofossil taxa. The BioGraph [Savary, J., Guex, J., 1999. Discrete biochronological scales and unitary associations: description of the BioGraph computer program. Memoires de Geologie de Lausanne 34, 282 pp.] and UA-Graph (copyright Hammer O., Guex and Savary, 2002) programs provide a discrete biochronological framework based upon multi-taxa concurrent range zones in the different sections. The optimized dataset generates nine UAs using the co-occurrences of 56 taxa. These UAs are grouped into six Unitary Association Zones (UA-Z), which constitute a robust biostratigraphic synthesis of all the observed or deduced biostratigraphic relationships between the analysed taxa. The UA zonation proposed here is compared to "classic" calcareous nannofossil biozonations, which are commonly used for the southern and northern sides of the Tethys. The biostratigraphic resolution of the UA-Zones varies from one nannofossil subzone, or part of it, to several subzones, and can be related to the pattern of calcareous nannoplankton originations and extinctions during the studied time interval. The Late Pliensbachian - Early Toarcian interval (corresponding to UA-Z II) represents a major step in the Jurassic nannoplankton radiation. The recognized UA-Zones are also compared to the negative carbon isotope excursion and the TOC maximum in five sections of central Italy, Germany and England, with the aim of providing a more reliable correlation tool for the Early Toarcian OAE, and the associated isotopic anomaly, between the southern and northern parts of the western Tethys. The results of this work show that the TOC maximum and the negative δ13C excursion correspond to the upper part of UA-Z II (i.e., UA 3) in the sections analysed. This suggests that the Early Toarcian OAE was a synchronous event within the western Tethys.
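The Unitary Associations machinery is implemented in the BioGraph and UA-Graph programs cited above; the toy sketch below only illustrates its combinatorial core under strong simplifications: taxa observed together in a sample are treated as compatible, and maximal cliques of the resulting compatibility graph are candidates for maximal associations. The sample contents are invented, and the real method's stratigraphic ordering and lateral merging of cliques are omitted.

```python
# Simplified illustration (not the BioGraph/UA-Graph algorithms) of the
# Unitary Associations idea: co-occurring taxa define a compatibility graph
# whose maximal cliques are candidate maximal associations.
import networkx as nx

samples = {                      # hypothetical sample -> observed taxa
    "S1": {"taxonA", "taxonB"},
    "S2": {"taxonB", "taxonC"},
    "S3": {"taxonA", "taxonB", "taxonC"},
    "S4": {"taxonC", "taxonD"},
}

G = nx.Graph()
for taxa in samples.values():
    taxa = sorted(taxa)
    for i in range(len(taxa)):
        for j in range(i + 1, len(taxa)):
            G.add_edge(taxa[i], taxa[j])  # observed co-occurrence

# Maximal cliques = maximal sets of pairwise-compatible taxa; the full UA
# method then orders these stratigraphically and merges equivalent ones.
for clique in nx.find_cliques(G):
    print(sorted(clique))
```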
Abstract:
This article investigates the trajectory of two Brazilian startup companies dedicated to research and development (R&D) in the biotechnology sector: Alellyx and CanaVialis. Both are academic spin-offs and bio-ventures germinated within the Genome Project of the Fundação de Amparo à Pesquisa do Estado de São Paulo (Fapesp), matured at Votorantim Novos Negócios (VNN), the new-business arm of one of the largest Brazilian industrial groups operating in the commodities segment, Grupo Votorantim S/A, and later sold to Monsanto. The study seeks to understand the rationale and merits of corporate investment actions and of public policies aimed at biotechnology focused on applied genomics for agriculture in Brazil. The methodology is the case study, specifically the analysis of the two companies mentioned above. The results show that, although the economic group's main objective was the rapid appreciation of the invested capital and its financial return, the corporate affiliation of these companies accelerated the development of a set of managerial capabilities at Alellyx and CanaVialis that were critical for the maturation of the business. It also became evident that the significant injection of resources through the support mechanisms of the national innovation system was fundamental.
Abstract:
Larger and larger deformable mirrors, with ever more actuators, are currently being used in adaptive optics applications. The control of mirrors with hundreds of actuators is a topic of great interest, since classical control techniques based on the pseudoinverse of the system control matrix become too slow when dealing with matrices of such large dimensions. This doctoral thesis proposes a method for accelerating and parallelizing the control algorithms for these mirrors, through the application of a control technique based on setting the smallest components of the control matrix to zero (sparsification), followed by an optimization of the actuator ordering according to the shape of the matrix, and finally by its subsequent division into small tridiagonal blocks. These blocks are much smaller and easier to use in the calculations, which allows much higher computation speeds because the null components of the control matrix are eliminated. Moreover, this approach allows the computation to be parallelized, giving the system an additional speed component. Even without parallelization, an increase of almost 40% in the convergence speed of mirrors with only 37 actuators has been obtained with the proposed technique. To validate this, a complete new experimental setup was implemented, including a programmable phase modulator for turbulence generation by means of phase screens, and a complete model of the control loop was developed to investigate the performance of the proposed algorithm. The results, both in simulation and experimentally, show full equivalence in the residual deviation values after compensation of the different types of aberrations for the different algorithms used, although the method proposed here entails a much lower computational load. The procedure is expected to be very successful when applied to very large mirrors.
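As a rough illustration of the pipeline described above (not the thesis code), the following sketch sparsifies a random stand-in control matrix, reorders it with a bandwidth-reducing permutation, and applies it as a sparse operator. The threshold and matrix are invented, and reverse Cuthill-McKee is used here only as a generic stand-in for the thesis's actuator-ordering step.

```python
# Sketch of the acceleration strategy: sparsify the control matrix, reorder
# actuators to concentrate entries near the diagonal, then use fast sparse
# products in the control loop. All quantities are stand-ins.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import reverse_cuthill_mckee

rng = np.random.default_rng(0)
n = 37                                   # number of actuators (thesis example size)
C = rng.normal(size=(n, n))              # stand-in for the mirror control matrix

threshold = 1.0                          # hypothetical sparsification level
C[np.abs(C) < threshold] = 0.0           # drop the smallest components
Cs = sp.csr_matrix(C)

perm = reverse_cuthill_mckee(Cs, symmetric_mode=False)  # bandwidth-reducing ordering
Cb = Cs[perm][:, perm]                   # reordered, near-banded control matrix

slopes = rng.normal(size=n)              # stand-in wavefront-sensor measurements
commands = Cb @ slopes[perm]             # sparse product: the per-iteration saving
```

The near-banded structure produced by the reordering is what makes the final division into small tridiagonal blocks, and hence the parallelization, possible.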
Abstract:
To perform a climatic analysis of the annual UV index (UVI) variations in Catalonia, Spain (northeast of the Iberian Peninsula), a new simple parameterization scheme is presented based on a multilayer radiative transfer model. The parameterization performs fast UVI calculations for a wide range of cloudless and snow-free situations and can be applied anywhere. The following parameters are considered: solar zenith angle, total ozone column, altitude, aerosol optical depth, and single-scattering albedo. A sensitivity analysis is presented to justify this choice, with special attention to aerosol information. Comparisons with the base model show good agreement, especially for the most common cases, with an absolute error within 0.2 UVI units for the wide range of cases considered. Two tests are done to show the performance of the parameterization against UVI measurements. One uses data from a high-quality spectroradiometer at Lauder, New Zealand [45.04°S, 169.684°E, 370 m above mean sea level (MSL)], where there is a low presence of aerosols. The other uses data from a Robertson-Berger-type meter at Girona, Spain (41.97°N, 2.82°E, 100 m MSL), where there is more aerosol load and where it has been possible to study the effect of aerosol information on the model versus measurement comparison. The parameterization is applied to a climatic analysis of the annual UVI variation in Catalonia, showing the contributions of solar zenith angle, ozone, and aerosols. High-resolution seasonal maps of typical UVI values in Catalonia are presented.
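The paper's fitted scheme is not reproduced here, but a hedged sketch can indicate what a parameterization over these five inputs might look like. The functional form and every coefficient below are illustrative placeholders, not the paper's fitted values.

```python
# Hypothetical shape of a five-input cloudless, snow-free UVI parameterization.
# Coefficients are invented placeholders chosen only for plausible magnitudes.
import numpy as np

def uvi(sza_deg, ozone_du, altitude_km, aod, ssa):
    """UV index from five inputs (illustrative only, not the paper's fit)."""
    mu = np.cos(np.radians(sza_deg))                  # solar elevation dependence
    uvi_clean = 12.0 * mu**2.4                        # hypothetical clean-sky term
    ozone_term = (300.0 / ozone_du) ** 1.2            # less ozone -> more UV
    altitude_term = 1.0 + 0.06 * altitude_km          # ~6% per km (typical order)
    aerosol_term = np.exp(-aod * (1.0 - 0.25 * ssa))  # absorbing aerosols cut UV more
    return uvi_clean * ozone_term * altitude_term * aerosol_term

print(round(uvi(sza_deg=30.0, ozone_du=320.0, altitude_km=0.1, aod=0.2, ssa=0.9), 1))
```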
Abstract:
PURPOSE: To compare the efficacy of antibiotic drops placed in the conjunctival cul-de-sac to antibiotic ointment applied to the lid margin in reduction of bacterial colonization on the lid margin. METHODS: A randomized, prospective, single-masked study was conducted on 19 patients with culture-proven colonization of bacteria on the lid margins. Ophthalmic eligibility criteria included the presence of ≥50 colony-forming units/mL (CFU/mL) of bacteria on both right and left lids. Each patient received one drop of ofloxacin in one eye every night for one week, followed by one drop once a week for one month. In the same manner, each patient received bacitracin ointment (erythromycin or gentamicin ointment if lid margin bacteria were resistant to bacitracin) to the lid margin of the fellow eye. Quantitative lid cultures were taken at initial visit, one week, one month, and two months. Fifteen volunteers (30 lids) served as controls. Lid cultures were taken at initial visit, one week, and one month. RESULTS: Both antibiotic drop and ointment reduced average bacterial CFU/mL at one week and one month. Average bacterial CFU/mL reestablished to baseline values at two months. There was no statistically significant difference between antibiotic drop and ointment in reducing bacterial colonization on the lid margin. CONCLUSION: Antibiotic drops placed in the conjunctival cul-de-sac appear to be as effective as ointment applied to the lid margins in reducing bacterial colonization in patients with ≥50 CFU/mL of bacteria on the lid margins.
Abstract:
The development of model observers for mimicking human detection strategies has progressed from symmetric signals in simple noise to increasingly complex backgrounds. In this study we implement different model observers for the complex task of detecting a signal in a 3D image stack. The backgrounds come from real breast tomosynthesis acquisitions and the signals were simulated and reconstructed within the volume. Two different tasks relevant to the early detection of breast cancer were considered: detecting an 8 mm mass and detecting a cluster of microcalcifications. The model observers were calculated using a channelized Hotelling observer (CHO) with dense difference-of-Gaussian channels, and a modified partial-prewhitening (PPW) observer adapted to realistic signals that are not circularly symmetric. The sustained temporal sensitivity function was used to filter the images before applying the spatial templates. At a frame rate of five frames per second, the one CHO that we calculated performed worse than the humans in a 4-AFC experiment. The other observers were variations of PPW and outperformed human observers in every single case. This initial frame rate was rather slow, and the temporal filtering did not affect the results compared to a data set with no human temporal effects taken into account. We subsequently investigated two higher speeds, 15 and 30 frames per second. We observed that for large masses, the two types of model observers investigated outperformed the human observers and would be suitable with the appropriate addition of internal noise. However, for microcalcifications only the PPW observer consistently outperformed the humans. The study demonstrated the possibility of using a model observer which takes into account the temporal effects of scrolling through an image stack while being able to effectively detect a range of mass sizes and distributions.
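For context, a channelized Hotelling observer reduces each image to a few channel responses and applies a Hotelling (prewhitened matched) template in channel space. The sketch below implements that recipe on synthetic 2D data; the channel widths, signal, and backgrounds are invented and far simpler than the tomosynthesis volumes and temporal filtering used in the study.

```python
# Minimal channelized Hotelling observer with difference-of-Gaussian channels
# on synthetic data. Channel parameters and signal are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
n = 64                                         # image size (pixels)
y, x = np.mgrid[:n, :n] - n // 2
r2 = x**2 + y**2

def dog_channel(sigma, ratio=1.67):
    c = np.exp(-r2 / (2 * (ratio * sigma)**2)) - np.exp(-r2 / (2 * sigma**2))
    return (c / np.linalg.norm(c)).ravel()

U = np.stack([dog_channel(s) for s in (2.0, 3.3, 5.5, 9.2)], axis=1)  # pixels x channels

signal = 0.8 * np.exp(-r2 / (2 * 4.0**2)).ravel()   # synthetic mass-like signal
absent = rng.normal(size=(200, n * n))              # background-only images
present = absent + signal                           # signal-present images

vA, vP = absent @ U, present @ U                    # channel responses
S = 0.5 * (np.cov(vA.T) + np.cov(vP.T))             # pooled channel covariance
w = np.linalg.solve(S, vP.mean(0) - vA.mean(0))     # Hotelling template
d2 = w @ (vP.mean(0) - vA.mean(0))                  # observer detectability squared
print("d' =", np.sqrt(d2))
```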
Abstract:
Over thirty years ago, Leamer (1983) - among many others - expressed doubts about the quality and usefulness of empirical analyses for the economic profession by stating that "hardly anyone takes data analyses seriously. Or perhaps more accurately, hardly anyone takes anyone else's data analyses seriously" (p. 37). Improvements in data quality, more robust estimation methods and the evolution of better research designs seem to make that assertion no longer justifiable (see Angrist and Pischke (2010) for a recent response to Leamer's essay). The economic profession and policy makers alike often rely on empirical evidence as a means to investigate policy-relevant questions. The approach of using scientifically rigorous and systematic evidence to identify policies and programs that are capable of improving policy-relevant outcomes is known under the increasingly popular notion of evidence-based policy. Evidence-based economic policy often relies on randomized or quasi-natural experiments in order to identify causal effects of policies. These can require relatively strong assumptions or raise concerns of external validity. In the context of this thesis, potential concerns are, for example, endogeneity of policy reforms with respect to the business cycle in the first chapter, the trade-off between precision and bias in the regression-discontinuity setting in chapter 2, and non-representativeness of the sample due to self-selection in chapter 3. While the identification strategies are very useful for gaining insights into the causal effects of specific policy questions, transforming the evidence into concrete policy conclusions can be challenging. Policy development should therefore rely on the systematic evidence of a whole body of research on a specific policy question rather than on a single analysis. In this sense, this thesis cannot and should not be viewed as a comprehensive analysis of specific policy issues but rather as a first step towards a better understanding of certain aspects of a policy question. The thesis applies new and innovative identification strategies to policy-relevant and topical questions in the fields of labor economics and behavioral environmental economics. Each chapter relies on a different identification strategy. In the first chapter, we employ a difference-in-differences approach to exploit the quasi-experimental change in the entitlement to the maximum unemployment benefit duration to identify the medium-run effects of reduced benefit durations on post-unemployment outcomes. Shortening benefit duration carries a double dividend: it generates fiscal benefits without deteriorating the quality of job matches. On the contrary, shortened benefit durations improve medium-run earnings and employment, possibly through containing the negative effects of skill depreciation or stigmatization. While the first chapter provides only indirect evidence on the underlying behavioral channels, in the second chapter I develop a novel approach that allows learning about the relative importance of the two key margins of job search - reservation wage choice and search effort. In the framework of a standard non-stationary job search model, I show how the exit rate from unemployment can be decomposed in a way that is informative about reservation wage movements over the unemployment spell.
The empirical analysis relies on a sharp discontinuity in unemployment benefit entitlement, which can be exploited in a regression-discontinuity approach to identify the effects of extended benefit durations on unemployment and survivor functions. I find evidence that calls for an important role of reservation wage choices in job search behavior. This can have direct implications for the optimal design of unemployment insurance policies. The third chapter - while thematically detached from the other chapters - addresses one of the major policy challenges of the 21st century: climate change and resource consumption. Many governments have recently put energy efficiency on top of their agendas. While pricing instruments aimed at regulating energy demand have often been found to be short-lived and difficult to enforce politically, the focus of energy conservation programs has shifted towards behavioral approaches - such as the provision of information or social norm feedback. The third chapter describes a randomized controlled field experiment in which we discuss the effectiveness of different types of feedback on residential electricity consumption. We find that detailed and real-time feedback caused persistent electricity reductions on the order of 3 to 5% of daily electricity consumption. Social norm information can also generate substantial electricity savings when designed appropriately. The findings suggest that behavioral approaches constitute an effective and relatively cheap way of improving residential energy efficiency.
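To make the first chapter's identification idea concrete, here is a schematic difference-in-differences regression on simulated data, not the thesis's actual data or specification; every variable name and number is invented, and statsmodels stands in for whatever software was used.

```python
# Schematic difference-in-differences: the coefficient on treated:post
# estimates the reform effect. All data are simulated with a known effect.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "treated": rng.integers(0, 2, n),   # affected by the benefit-duration reform
    "post":    rng.integers(0, 2, n),   # observed after the reform
})
true_effect = 0.8
df["earnings"] = (10 + 0.5 * df.treated + 0.3 * df.post
                  + true_effect * df.treated * df.post
                  + rng.normal(size=n))

m = smf.ols("earnings ~ treated * post", data=df).fit(cov_type="HC1")
print(m.params["treated:post"])         # difference-in-differences estimate
```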
Abstract:
ABSTRACT: Ultramarathons comprise any sporting event involving running longer than the traditional marathon length of 42.195 km (26.2 miles). Studies on ultramarathon participants can investigate the acute consequences of ultra-endurance exercise on inflammation and cardiovascular or renal consequences, as well as endocrine/energetic aspects, and examine the tissue recovery process over several days of extreme physical load. In a study published in BMC Medicine, Schütz et al. followed 44 ultramarathon runners over 4,487 km from South Italy to North Cape, Norway (the Trans Europe Foot Race 2009) and recorded daily sets of data from magnetic resonance imaging, psychometric, body composition and biological measurements. The findings will allow us to better understand the time course of degeneration/regeneration of some lower leg tissues such as knee joint cartilage, to differentiate running-induced from age-induced pathologies (for example, retropatellar arthritis) and finally to assess the interindividual susceptibility to injuries. Moreover, it will also provide new information about the complex interplay between cerebral adaptations/alterations and hormonal influences resulting from endurance exercise and provide data on the dose-response relationship between exercise and brain structure/function. Overall, this study represents a unique attempt to investigate the limits of the adaptive response of human bodies. Please see related article: http://www.biomedcentral.com/1741-7015/10/78.
The evolution of XY recombination: sexually antagonistic selection versus deleterious mutation load.
Abstract:
Recombination arrest between X and Y chromosomes, driven by sexually antagonistic genes, is expected to induce their progressive differentiation. However, in contrast to birds and mammals (which display the predicted pattern), most cold-blooded vertebrates have homomorphic sex chromosomes. Two main hypotheses have been proposed to account for this, namely high turnover rates of sex-determining systems and occasional XY recombination. Using individual-based simulations, we formalize the evolution of XY recombination (here mediated by sex reversal; the "fountain-of-youth" model) under the contrasting forces of sexually antagonistic selection and deleterious mutations. The shift between the domains of elimination and accumulation occurs at much lower selection coefficients for the Y than for the X. In the absence of dosage compensation, mildly deleterious mutations accumulating on the Y depress male fitness, thereby providing incentives for XY recombination. Under our settings, this occurs via "demasculinization" of the Y, allowing recombination in XY (sex-reversed) females. As we also show, this generates a conflict with the X, which coevolves to oppose sex reversal. The resulting rare events of XY sex reversal are enough to purge the Y from its load of deleterious mutations. Our results support the "fountain of youth" as a plausible mechanism to account for the maintenance of sex-chromosome homomorphy.
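A drastically reduced sketch of the accumulation-and-purging logic, not the paper's individual-based model: a population of Y haplotypes gains mildly deleterious mutations each generation, selection acts on male fitness, and rare X-Y recombination (sex reversal in the full model) lets a Y lineage swap its load for that of an X. All parameters are illustrative, and the X haplotypes are kept clean by assumption rather than simulated.

```python
# Toy simulation of deleterious load on a non-recombining Y, with rare
# XY recombination purging the load. Parameters are invented.
import numpy as np

rng = np.random.default_rng(3)
N, U, s = 500, 0.1, 0.02             # population size, mutation rate, selection coeff.
rec_rate = 0.001                     # rare XY recombination events per lineage
y_load = np.zeros(N, dtype=int)      # deleterious mutations per Y haplotype
x_load = np.zeros(N, dtype=int)      # stand-in X haplotypes (kept clean by assumption)

for generation in range(2000):
    fitness = (1 - s) ** y_load
    parents = rng.choice(N, size=N, p=fitness / fitness.sum())  # selection
    y_load = y_load[parents] + rng.poisson(U, size=N)           # reproduction + mutation
    swap = rng.random(N) < rec_rate                             # rare XY recombination
    y_load[swap] = x_load[swap]                                 # purge via the X

print("mean Y mutation load after 2000 generations:", y_load.mean())
```

Setting rec_rate to zero in this toy makes the mean load climb steadily, which mirrors the paper's point that even rare recombination events suffice to purge the Y.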
Abstract:
The application of the Fry method to measure strain in deformed porphyritic granites is discussed. This method requires that the distribution of markers satisfy at least two conditions: it must be homogeneous and isotropic. Homogeneity can easily be tested with statistics on the point distribution using a Morishita diagram. Isotropy can be checked with a cumulative histogram of angles between points. Application of these tests to an undeformed (Mte Capanne granite, Elba) and a deformed (Randa orthogneiss, Alps of Switzerland) porphyritic granite reveals that their K-feldspar phenocrysts satisfy both conditions and can be used as strain markers with the Fry method. Other problems are also examined, such as the possible distribution of deformation on discrete shear bands. Provided several tests are met, we conclude that the Fry method can be used to estimate strain in deformed porphyritic granites.
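The Fry construction itself is simple to state: translate each marker centre to the origin in turn and re-plot all other centres around it; the vacancy at the centre of the resulting point cloud images the strain ellipse. A minimal sketch with random stand-in coordinates (not real phenocryst data) follows.

```python
# Fry plot: all pairwise separation vectors between marker centres.
# Coordinates are random stand-ins for K-feldspar phenocryst centres.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(4)
pts = rng.uniform(0, 100, size=(300, 2))        # marker centre coordinates

diffs = pts[:, None, :] - pts[None, :, :]       # every centre relative to every other
diffs = diffs[~np.eye(len(pts), dtype=bool)]    # drop self-pairs

plt.scatter(diffs[:, 0], diffs[:, 1], s=1)
plt.gca().set_aspect("equal")                   # the central vacancy maps the strain
plt.show()
```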
Abstract:
Background: To enhance our understanding of complex biological systems like diseases we need to put all of the available data into context and use this to detect relations, patterns and rules which allow predictive hypotheses to be defined. Life science has become a data-rich science, with information about the behaviour of millions of entities like genes, chemical compounds, diseases, cell types and organs, which is organised in many different databases and/or spread throughout the literature. Existing knowledge such as genotype-phenotype relations or signal transduction pathways must be semantically integrated and dynamically organised into structured networks that are connected with clinical and experimental data. Different approaches to this challenge exist but so far none has proven entirely satisfactory. Results: To address this challenge we previously developed a generic knowledge management framework, BioXM™, which allows the dynamic, graphic generation of domain-specific knowledge representation models based on specific objects and their relations, supporting annotations and ontologies. Here we demonstrate the utility of BioXM for knowledge management in systems biology as part of the EU FP6 BioBridge project on translational approaches to chronic diseases. From clinical and experimental data, text-mining results and public databases we generate a chronic obstructive pulmonary disease (COPD) knowledge base and demonstrate its use by mining specific molecular networks together with integrated clinical and experimental data. Conclusions: We generate the first semantically integrated COPD-specific public knowledge base and find that for the integration of clinical and experimental data with pre-existing knowledge the configuration-based set-up enabled by BioXM reduced implementation time and effort for the knowledge base compared to similar systems implemented as classical software development projects. The knowledge base enables the retrieval of sub-networks including protein-protein interaction, pathway, gene-disease and gene-compound data, which are used for subsequent data analysis, modelling and simulation. Pre-structured queries and reports enhance usability; establishing their use in everyday clinical settings requires further simplification with a browser-based interface, which is currently under development.
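BioXM itself is a configurable platform, so the following is only a toy analogy (not the BioXM API) of the kind of sub-network retrieval described: entities and typed relations held in one graph and queried around a disease node. All nodes and relations are invented.

```python
# Toy knowledge-base query: typed relations in one graph, retrieved as the
# two-step neighbourhood of a disease node. Not the BioXM API.
import networkx as nx

kb = nx.MultiDiGraph()
kb.add_edge("COPD", "geneA", relation="gene-disease")
kb.add_edge("geneA", "proteinA", relation="encodes")
kb.add_edge("proteinA", "proteinB", relation="protein-protein interaction")
kb.add_edge("compoundX", "geneA", relation="gene-compound")

# Everything within two relations of the disease node, ignoring direction.
near = nx.single_source_shortest_path_length(kb.to_undirected(), "COPD", cutoff=2)
sub = kb.subgraph(near)
for u, v, d in sub.edges(data=True):
    print(u, "->", v, d["relation"])
```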
Abstract:
Over the years, bridge engineers have been concerned about the response of prestressed concrete (PC) girder bridges that have been hit by over-height vehicles or vehicle loads. When a bridge is struck by an over-height vehicle or vehicle load, usually the outside girder and, in some instances, one of the interior girders are damaged. The effect of intermediate diaphragms in providing damage protection to the PC girders of a bridge is not clearly defined. This analytical study focused on the role of intermediate diaphragms in reducing the occurrence of damage in the girders of a PC-girder bridge that has been struck by an over-height vehicle or vehicle load. The study also investigated whether a steel intermediate diaphragm would essentially provide the same degree of impact protection for PC girders as that provided by a reinforced-concrete diaphragm. This investigation includes a literature search and a survey questionnaire to determine the state of the art in the use and design of intermediate diaphragms in PC-girder bridges. Comparisons were made between the strain and displacement results that were experimentally measured for a large-scale, laboratory, model bridge during previously documented work and the results obtained from analyses of the finite-element models that were developed during this research for that bridge. These comparisons were conducted to calibrate the finite-element models used in the analyses for this research on intermediate diaphragms. Finite-element models were developed for non-skewed and skewed PC-girder bridges. Each model was analyzed with either a reinforced-concrete diaphragm or one of two types of steel intermediate diaphragms, located at the mid-span of an interior span of a PC-girder bridge. The bridge models were analyzed for lateral-impact loads applied to the bottom flange of the exterior girders both at the diaphragm location and away from it. A comparison was conducted between the strains and displacements induced in the girders for each intermediate-diaphragm type. These results showed that intermediate diaphragms have an effect in reducing impact damage to the PC girders. When the lateral-impact load was applied at the diaphragm location, the reinforced-concrete diaphragms provided more protection for the girders than the two types of steel diaphragms. The three types of diaphragms provided essentially the same degree of protection to the impacted PC girder when the lateral-impact load was applied away from the diaphragm location.
Abstract:
We present a new technique for audio signal comparison based on tonal subsequence alignment and its application to detecting cover versions (i.e., different performances of the same underlying musical piece). Cover song identification is a task whose popularity has increased in the Music Information Retrieval (MIR) community over the past years, as it provides a direct and objective way to evaluate music similarity algorithms. This article first presents a series of experiments carried out with two state-of-the-art methods for cover song identification. We have studied several of their components (such as chroma resolution and similarity, transposition, beat tracking and Dynamic Time Warping constraints) in order to discover which characteristics would be desirable for a competitive cover song identifier. After analyzing many cross-validated results, the importance of these characteristics is discussed, and the best-performing ones are applied to the newly proposed method. Multiple evaluations of this method confirm a large increase in identification accuracy when comparing it with alternative state-of-the-art approaches.
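Two of the components studied (transposition handling and Dynamic Time Warping) can be illustrated compactly. The sketch below is not the authors' system: it computes an optimal transposition index from averaged chroma vectors and aligns two random stand-in chromagrams with a plain DTW recursion; a real identifier would extract chroma from audio and use constrained alignment.

```python
# Transposition via optimal circular shift of averaged chroma (OTI), then
# plain DTW over frame-wise chroma distances. Chromagrams are random stand-ins.
import numpy as np

rng = np.random.default_rng(5)
A = rng.random((80, 12))                                       # chroma frames, song A
B = np.roll(A[:70] + 0.05 * rng.random((70, 12)), 3, axis=1)   # transposed "cover"

# Optimal transposition index: circular shift maximising global chroma match.
gA, gB = A.mean(0), B.mean(0)
oti = max(range(12), key=lambda k: gA @ np.roll(gB, -k))
B = np.roll(B, -oti, axis=1)                                   # move B into A's key

# Dynamic Time Warping over the pairwise frame distances.
cost = np.linalg.norm(A[:, None, :] - B[None, :, :], axis=2)
D = np.full((len(A) + 1, len(B) + 1), np.inf)
D[0, 0] = 0.0
for i in range(1, len(A) + 1):
    for j in range(1, len(B) + 1):
        D[i, j] = cost[i - 1, j - 1] + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])

print("OTI:", oti, "  alignment cost:", D[-1, -1])
```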