988 results for Current-source inverter (CSI)
Abstract:
Drug resistance and therapy failure due to drug-drug interactions are the main challenges in current treatment of Human Immunodeficiency Virus (HIV) infection. As such, there is a continuous need for the development of new and more potent anti-HIV drugs. Here we established a high-throughput screen based on the highly permissive TZM-bl cell line to identify novel HIV inhibitors. The assay discriminates between compounds acting on early and/or late steps of the HIV replication cycle. The platform was used to screen a unique library of secondary metabolites derived from myxobacteria. Several hits with good anti-HIV profiles were identified. Five of the initial hits were tested for their antiviral potency. Four myxobacterial compounds, sulfangolid C, soraphen F, epothilone D and spirangien B, showed EC50 values in the nM range with SI > 15. Interestingly, we found a large number of hits overlapping with those of a previous screen for Hepatitis C Virus (HCV) that used the same library. The unique structures and modes of action of these natural compounds make myxobacteria an attractive source of chemicals for the development of broad-spectrum antivirals. Further biological and structural studies of our initial hits might help identify smaller drug-like derivatives that in turn could be synthesized and further optimized.
Abstract:
Community-level patterns of functional traits relate to community assembly and ecosystem functioning. By modelling the changes of different indices describing such patterns (trait means, extremes and diversity in communities) as a function of abiotic gradients, we can understand their drivers and build projections of the impact of global change on the functional components of biodiversity. We used five plant functional traits (vegetative height, specific leaf area, leaf dry matter content, leaf nitrogen content and seed mass) and non-woody vegetation plots to model several indices depicting community-level patterns of functional traits from a set of abiotic environmental variables (topographic, climatic and edaphic) over contrasting environmental conditions in a mountainous landscape. We performed a variation partitioning analysis to assess the relative importance of these variables for predicting patterns of functional traits in communities, and projected the best models under several climate change scenarios to examine future potential changes in vegetation functional properties. Not all indices of trait patterns within communities could be modelled with the same level of accuracy: the models for mean and extreme values of functional traits provided substantially better predictive accuracy than the models calibrated for diversity indices. Topographic and climatic factors were more important predictors of functional trait patterns within communities than edaphic predictors. Overall, model projections forecast an increase in mean vegetation height and in mean specific leaf area following climate warming. This trend was particularly pronounced at mid-elevations, between 1000 and 2000 m a.s.l. With this study we showed that topographic, climatic and edaphic variables can successfully model descriptors of community-level patterns of plant functional traits such as mean and extreme trait values.
However, which factors determine the diversity of functional traits in plant communities remains unclear and requires further investigation.
Abstract:
BACKGROUND: Scientists have long been trying to understand the molecular mechanisms of diseases in order to design preventive and therapeutic strategies. For some diseases, it has become evident that it is not enough to obtain a catalogue of the disease-related genes; we must also uncover how disruptions of molecular networks in the cell give rise to disease phenotypes. Moreover, with the unprecedented wealth of information available, even obtaining such a catalogue is extremely difficult.
PRINCIPAL FINDINGS: We developed a comprehensive gene-disease association database by integrating associations from several sources that cover different biomedical aspects of diseases. In particular, we focus on the current knowledge of human genetic diseases, including Mendelian, complex and environmental diseases. To assess the concept of modularity of human diseases, we performed a systematic study of the emergent properties of human gene-disease networks by means of network topology and functional annotation analysis. The results indicate a highly shared genetic origin of human diseases and show that for most diseases, including Mendelian, complex and environmental diseases, functional modules exist. Moreover, a core set of biological pathways is found to be associated with most human diseases. We obtained similar results when studying clusters of diseases, suggesting that related diseases might arise due to dysfunction of common biological processes in the cell.
CONCLUSIONS: For the first time, we include Mendelian, complex and environmental diseases in an integrated gene-disease association database and show that the concept of modularity applies to all of them. We furthermore provide a functional analysis of disease-related modules, yielding important new biological insights that might not be discovered when considering each of the gene-disease association repositories independently. Hence, we present a suitable framework for the study of how genetic and environmental factors, such as drugs, contribute to diseases.
AVAILABILITY: The gene-disease networks used in this study and part of the analysis are available at http://ibi.imim.es/DisGeNET/DisGeNETweb.html#Download
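The modularity analysis described above rests on projecting a bipartite gene-disease network onto a disease-disease network. A minimal sketch of that projection in plain Python may help fix the idea; the associations below are invented toy data, not drawn from DisGeNET, and this is not the authors' pipeline:

```python
from collections import defaultdict
from itertools import combinations

# Toy bipartite gene-disease associations (hypothetical examples).
gene_disease = {
    "BRCA1": {"breast cancer", "ovarian cancer"},
    "TP53":  {"breast cancer", "lung cancer", "ovarian cancer"},
    "CFTR":  {"cystic fibrosis"},
}

# Project onto diseases: two diseases are linked if they share at
# least one associated gene; the edge weight counts shared genes.
shared = defaultdict(int)
for diseases in gene_disease.values():
    for d1, d2 in combinations(sorted(diseases), 2):
        shared[(d1, d2)] += 1

disease_edges = dict(shared)
# Diseases with heavy shared-gene edges are candidates for a common
# functional module in the sense discussed in the abstract.
```

Topological quantities such as degree distributions or clustering would then be computed on `disease_edges`; the abstract's conclusion is that such projections reveal shared modules across Mendelian, complex and environmental diseases.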
Abstract:
In recent years, comprehensive representations of cell signalling pathways have been developed by manual curation from the literature, which requires huge effort and would benefit from information stored in databases and from automatic retrieval and integration methods. Once a reconstruction of the network of interactions is achieved, analysis of its structural features and its dynamic behaviour can take place. Mathematical modelling techniques are used to simulate the complex behaviour of cell signalling networks, which ultimately sheds light on the mechanisms leading to complex diseases or helps in the identification of drug targets. A variety of databases containing information on cell signalling pathways have been developed, in conjunction with methodologies to access and analyse the data. In principle, the scenario is prepared to make the most of this information for the analysis of the dynamics of signalling pathways. However, are the knowledge repositories of signalling pathways ready to realize the systems biology promise? In this article we aim to initiate this discussion and to provide some insights on this issue.
Abstract:
Gene set enrichment (GSE) analysis is a popular framework for condensing information from gene expression profiles into a pathway or signature summary. The strengths of this approach over single-gene analysis include noise and dimension reduction, as well as greater biological interpretability. As molecular profiling experiments move beyond simple case-control studies, robust and flexible GSE methodologies are needed that can model pathway activity within highly heterogeneous data sets. To address this challenge, we introduce Gene Set Variation Analysis (GSVA), a GSE method that estimates variation of pathway activity over a sample population in an unsupervised manner. We demonstrate the robustness of GSVA in a comparison with current state-of-the-art sample-wise enrichment methods. Further, we provide examples of its utility in differential pathway activity and survival analysis. Lastly, we show how GSVA works analogously with data from both microarray and RNA-seq experiments. GSVA provides increased power to detect subtle pathway activity changes over a sample population in comparison to corresponding methods. While GSE methods are generally regarded as end points of a bioinformatic analysis, GSVA constitutes a starting point for building pathway-centric models of biology. Moreover, GSVA addresses the current need for GSE methods applicable to RNA-seq data. GSVA is an open-source software package for R which forms part of the Bioconductor project and can be downloaded at http://www.bioconductor.org.
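The core idea of sample-wise enrichment is to score a gene set per sample rather than per cohort. The sketch below is NOT the GSVA algorithm (GSVA uses kernel-estimated expression CDFs and a Kolmogorov-Smirnov-like random-walk statistic); it is only a simplified rank-based illustration of unsupervised, per-sample pathway scoring, with invented data:

```python
# Simplified single-sample gene set score: rank genes by expression
# within one sample, then compare the mean rank of the gene set's
# members against the genome-wide expected mean rank.

def sample_set_score(expression, gene_set):
    """expression: dict gene -> value for ONE sample;
    returns >0 if the set sits above the average rank."""
    genes = sorted(expression, key=expression.get)       # ascending
    rank = {g: i + 1 for i, g in enumerate(genes)}       # 1 = lowest
    in_set = [rank[g] for g in gene_set if g in rank]
    expected = (len(genes) + 1) / 2                      # mean rank of all genes
    return sum(in_set) / len(in_set) - expected

sample = {"A": 0.1, "B": 2.3, "C": 5.0, "D": 0.7}
score = sample_set_score(sample, {"B", "C"})             # set is high-ranked
```

Computing such a score for every sample yields a pathway-by-sample matrix, which is the kind of object GSVA hands to downstream models such as differential pathway activity or survival analysis.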
Abstract:
The treatment of writer's cramp, a task-specific focal hand dystonia, needs new approaches. A deficiency of inhibition in the motor cortex might cause writer's cramp. Transcranial direct current stimulation modulates cortical excitability and may provide a therapeutic alternative. In this randomized, double-blind, sham-controlled study, we investigated the efficacy of cathodal stimulation of the contralateral motor cortex in 3 sessions in 1 week. Assessment over a 2-week period included clinical scales, subjective ratings, kinematic handwriting analysis, and neurophysiological evaluation. Twelve patients with unilateral dystonic writer's cramp were investigated; 6 received transcranial direct current and 6 sham stimulation. Cathodal transcranial direct current stimulation had no favorable effects on clinical scales and failed to restore normal handwriting kinematics and cortical inhibition. Subjective worsening remained unexplained, leading to premature study termination. Repeated sessions of cathodal transcranial direct current stimulation of the motor cortex yielded no favorable results supporting a therapeutic potential in writer's cramp.
Abstract:
Locally advanced prostate cancer (LAPC) is a heterogeneous entity usually embracing T3-4 and/or pelvic lymph-node-positive disease in the absence of established metastases. Outcomes for LAPC with single therapies have traditionally been poor, leading to the investigation of adjuvant therapies. Prostate cancer is a hormonally sensitive tumour, which usually responds to pharmacological manipulation of the androgen receptor or its testosterone-related ligands. As such, androgen deprivation therapy (ADT) has become an important adjuvant strategy for the treatment of LAPC, particularly for patients managed primarily with radiotherapy. Such results have generally not been replicated in surgical patients. With increased use of ADT has come improved awareness of the numerous toxicities associated with long-term use of these agents, as well as the development of strategies for minimizing ADT exposure and actively managing adverse effects. Several trials are exploring agents to enhance radiation cell sensitivity as well as the application of adjuvant docetaxel, an agent with proven efficacy in the metastatic, castrate-resistant setting. The recent work showing activity of cabazitaxel, sipuleucel-T and abiraterone for castrate-resistant disease in the post-docetaxel setting will see these agents investigated in conjunction with definitive surgery and radiotherapy.
Abstract:
The motivation for this research arose from the abrupt rise and fall of minicomputers, which were initially used both for industrial automation and business applications due to their significantly lower cost than their predecessors, the mainframes. Later, industrial automation developed its own vertically integrated hardware and software to address the application needs of uninterrupted operation, real-time control and resilience to harsh environmental conditions. This led to the creation of an independent industry, namely industrial automation as used in PLC, DCS, SCADA and robot control systems. This industry today employs over 200,000 people in a profitable, slow-clockspeed context, in contrast to the two mainstream computing industries: information technology (IT), focused on business applications, and telecommunications, focused on communications networks and hand-held devices. Already in the 1990s it was foreseen that IT and communications would merge into a single information and communication technology (ICT) industry. The fundamental question of the thesis is: could industrial automation leverage a common technology platform with the newly formed ICT industry? Computer systems dominated by complex instruction set computers (CISC) were challenged during the 1990s by higher-performance reduced instruction set computers (RISC). RISC evolved in parallel with the constant advancement of Moore's law. These developments created the high-performance, low-energy-consumption System-on-Chip (SoC) architecture. Unlike the CISC world, RISC processor architecture is an industry separate from RISC chip manufacturing. It also has several hardware-independent software platforms, each consisting of an integrated operating system, development environment, user interface and application market, which gives customers more choices through hardware-independent, real-time-capable software applications.
An architecture disruption emerged, and the smartphone and tablet markets were formed with new rules and new key players in the ICT industry. Today there are more RISC computer systems running Linux (or other Unix variants) than any other computer system. The astonishing rise of SoC-based technologies and related software platforms in smartphones created, in unit terms, the largest installed base ever seen in the history of computers, and it is now being further extended by tablets. An underlying additional element of this transition is the increasing role of open-source technologies in both software and hardware. This has driven the microprocessor-based personal computer industry, with its few dominant closed operating system platforms, into a steep decline. A significant factor in this process has been the separation of processor architecture from processor chip production, and the merger of operating systems and application development platforms into integrated software platforms with proprietary application markets. Furthermore, pay-by-click marketing has changed the way application development is compensated: freeware, ad-based or licensed - all at a lower price and used by a wider customer base than ever before. Moreover, the concept of a software maintenance contract is very remote in the app world. However, as a slow-clockspeed industry, industrial automation has remained intact during the disruptions based on SoC and related software platforms in the ICT industries. Industrial automation incumbents continue to supply vertically integrated systems consisting of proprietary software and proprietary, mainly microprocessor-based, hardware.
They enjoy admirable profitability levels on a very narrow customer base, due to strong technology-enabled customer lock-in and customers' high risk exposure, as their production depends on fault-free operation of the industrial automation systems. When will this balance of power be disrupted? The thesis suggests how industrial automation could join the mainstream ICT industry and create an information, communication and automation (ICAT) industry. Lately, the Internet of Things (IoT) and weightless networks, a new standard leveraging frequency channels earlier occupied by TV broadcasting, have gradually started to change the rigid world of machine-to-machine (M2M) interaction. It is foreseeable that enough momentum will be created that the industrial automation market will in due course face an architecture disruption empowered by these new trends. This thesis examines the current state of industrial automation, subject to the competition between the incumbents, first through research on cost-competitiveness efforts in captive outsourcing of engineering, research and development, and secondly through research on process re-engineering in the case of complex-system global software support. Thirdly, we investigate the views of industry actors, namely customers, incumbents and newcomers, on the future direction of industrial automation, and conclude with our assessment of the possible routes industrial automation could take, given the looming rise of the Internet of Things (IoT) and weightless networks. Industrial automation is an industry dominated by a handful of global players, each of them focused on maintaining their own proprietary solutions.
The rise of de facto standards like the IBM PC, Unix, Linux and SoC, leveraged by IBM, Compaq, Dell, HP, ARM, Apple, Google, Samsung and others, has created the new markets of personal computers, smartphones and tablets, and will eventually also impact industrial automation through game-changing commoditization and related control-point and business-model changes. This trend will inevitably continue, but the transition to commoditized industrial automation will not happen in the near future.
Abstract:
During the past decades, anticancer immunotherapy has evolved from a promising therapeutic option to a robust clinical reality. Many immunotherapeutic regimens are now approved by the US Food and Drug Administration and the European Medicines Agency for use in cancer patients, and many others are being investigated as standalone therapeutic interventions or combined with conventional treatments in clinical studies. Immunotherapies may be subdivided into "passive" and "active" based on their ability to engage the host immune system against cancer. Since the anticancer activity of most passive immunotherapeutics (including tumor-targeting monoclonal antibodies) also relies on the host immune system, this classification does not properly reflect the complexity of the drug-host-tumor interaction. Alternatively, anticancer immunotherapeutics can be classified according to their antigen specificity. While some immunotherapies specifically target one (or a few) defined tumor-associated antigen(s), others operate in a relatively non-specific manner and boost natural or therapy-elicited anticancer immune responses of unknown and often broad specificity. Here, we propose a critical, integrated classification of anticancer immunotherapies and discuss the clinical relevance of these approaches.
Abstract:
This report presents population estimates for July 1, 1972 and provisional estimates for July 1, 1973, for counties and metropolitan areas prepared under the auspices of the Federal-State Cooperative Program for Local Population Estimates. The objective of this program is the development and publication of State-prepared estimates of the population of counties using uniform procedures largely standardized for data input and methodology. The methods used have been mutually agreed upon by the individual States and the Bureau of the Census on the basis of a test of methods against the 1970 census. For a more detailed description of the program.
Abstract:
This report presents population estimates for July 1, 1973 and provisional estimates for July 1, for counties and metropolitan areas prepared under the auspices of the Federal-State Cooperative Program for Local Population Estimates. The objective of this program is the development and publication of State-prepared estimates of the population of counties using uniform procedures largely standardized for data input and methodology. The methods used have been mutually agreed upon by the individual States and the Bureau of the Census on the basis of a test of methods against the 1970 census. For a more detailed description of the program.
Abstract:
An analytical model based on Bowen and Holman [1989] is used to prove the existence of instabilities due to the presence of a second extremum of the background vorticity at the front side of the longshore current. The growth rate of the so-called frontshear waves depends primarily upon the frontshear, but also upon the backshear and the maximum and width of the current. Depending on the values of these parameters, either the frontshear or the backshear instabilities may dominate. Both types of waves have a cross-shore extension of the order of the width of the current, but the frontshear modes are localized closer to the coast than the backshear modes. Moreover, under certain conditions both unstable waves have similar growth rates with close wave numbers and angular frequencies, leading to the possibility of modulated shear waves in the alongshore direction. Numerical analyses performed on realistic current profiles confirm the behavior anticipated by the analytical model. The theory has been applied to a current profile fitted to data measured during the 1980 Nearshore Sediment Transport Studies experiment at Leadbetter Beach, which has an extremum of background vorticity at the front side of the current. In this case, and in agreement with field observations, the model predicts instability, whereas the theory based only on backshear instability failed to do so.