Developmental Brain Dysfunction: Revival and Expansion of Old Concepts Based on New Genetic Evidence
Abstract:
Neurodevelopmental disorders can be caused by many different genetic abnormalities that are individually rare but collectively common. Specific genetic causes, including certain copy number variants and single-gene mutations, are shared among disorders that are thought to be clinically distinct. This evidence of variability in the clinical manifestations of individual genetic variants and sharing of genetic causes among clinically distinct brain disorders is consistent with the concept of developmental brain dysfunction, a term we use to describe the abnormal brain function underlying a group of neurodevelopmental and neuropsychiatric disorders and to encompass a subset of various clinical diagnoses. Although many pathogenic genetic variants are currently thought to be variably penetrant, we hypothesise that when disorders encompassed by developmental brain dysfunction are considered as a group, the penetrance will approach 100%. The penetrance is also predicted to approach 100% when the phenotype being considered is a specific trait, such as intelligence or autistic-like social impairment, and that trait is assessed with a continuous, quantitative measure comparing probands with non-carrier family members, rather than as a qualitative, dichotomous trait comparing probands with the healthy population. Copyright 2013 Elsevier Ltd. All rights reserved.
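The quantitative-versus-dichotomous contrast in the final sentence can be made concrete with a toy calculation. The sketch below is purely illustrative (all IQ values are invented, and the 70-point diagnostic cutoff is only a conventional example): a variant can shift a continuous trait consistently in every family while producing no diagnoses at all under a dichotomous, population-referenced definition.

```python
# Illustrative sketch (not from the paper): a hypothetical variant shifts a
# continuous trait (IQ) in carriers relative to non-carrier relatives, yet
# no carrier crosses a dichotomous diagnostic cutoff. All numbers invented.

family_mean_iq = [112, 95, 104, 120, 88, 101]   # non-carrier family baselines
carrier_iq     = [97, 80, 90, 106, 73, 85]      # proband (carrier) in each family

# Dichotomous view: "affected" only below a population cutoff (e.g. IQ < 70)
diagnosed = sum(iq < 70 for iq in carrier_iq)

# Quantitative view: within-family shift attributable to the variant
shifts = [c - f for c, f in zip(carrier_iq, family_mean_iq)]
mean_shift = sum(shifts) / len(shifts)

print(diagnosed)             # 0 -> variant looks "non-penetrant" dichotomously
print(round(mean_shift, 1))  # -14.8 -> consistent downward shift in every family
```

The point mirrors the abstract's hypothesis: judged against the healthy population, penetrance looks incomplete; judged as a within-family quantitative shift, the effect appears in every carrier.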
Abstract:
Epothilones are macrocyclic bacterial natural products with potent microtubule-stabilizing and antiproliferative activity. They have served as successful lead structures for the development of several clinical candidates for anticancer therapy. However, the structural diversity of this group of clinical compounds is rather limited, as their structures show little divergence from the original natural product leads. Our own research has explored the question of whether epothilones can serve as a basis for the development of new structural scaffolds, or chemotypes, for microtubule stabilization that might enable the discovery of new generations of anticancer drugs. We have elaborated a series of epothilone-derived macrolactones whose overall structural features significantly deviate from those of the natural epothilone scaffold and thus define new structural families of microtubule-stabilizing agents. Key elements of our hypermodification strategy are the change of the natural epoxide geometry from cis to trans, the incorporation of a conformationally constrained side chain, the removal of the C3-hydroxyl group, and the replacement of C12 with nitrogen. So far, this approach has yielded analogs 30 and 40 as its most advanced, most rigorously modified structures; both are potent antiproliferative agents with low nanomolar activity against several human cancer cell lines in vitro. The synthesis was achieved through a macrolactone-based strategy or a high-yielding RCM reaction. The 12-aza-epothilone ("azathilone" 40) may be considered a "non-natural" natural product that still retains most of the overall structural characteristics of a true natural product but is structurally unique, because it lies outside of the general scope of Nature's biosynthetic machinery for polyketide synthesis. Like natural epothilones, both 30 and 40 promote tubulin polymerization in vitro and, at the cellular level, induce cell cycle arrest in mitosis.
These facts indicate that cancer cell growth inhibition by these compounds is based on the same mechanistic underpinnings as those for natural epothilones. Interestingly, the 9,10-dehydro analog of 40 is significantly less active than the saturated parent compound, which is contrary to observations for natural epothilones B or D. This may point to differences in the bioactive conformations of N-acyl-12-aza-epothilones like 40 and natural epothilones. In light of their distinct structural features, combined with an epothilone-like (and taxol-like) in vitro biological profile, 30 and 40 can be considered as representative examples of new chemotypes for microtubule stabilization. As such, they may offer the same potential for pharmacological differentiation from the original epothilone leads as various newly discovered microtubule-stabilizing natural products with macrolactone structures, such as laulimalide, peloruside, or dictyostatin.
Abstract:
During the past decade microbeam radiation therapy has evolved from preclinical studies to a stage in which clinical trials can be planned, using spatially fractionated, highly collimated, high-intensity beams like those generated at the x-ray ID17 beamline of the European Synchrotron Radiation Facility. The production of such microbeams, with typical full width at half maximum (FWHM) values between 25 and 100 microm and center-to-center (c-t-c) spacings of 100-400 microm, requires a multislit collimator with either fixed or adjustable microbeam width. The mechanical regularity of such devices is the most important property required to produce an array of identical microbeams; it ensures treatment reproducibility and reliable use of Monte Carlo-based treatment planning systems. New high-precision wire cutting techniques allow these collimators to be fabricated from tungsten carbide. We present a variable slit width collimator as well as a single slit device with a fixed setting of 50 microm FWHM and 400 microm c-t-c, both able to cover irradiation fields of 50 mm width, deemed to meet clinical requirements. Important improvements have reduced the standard deviation from 5.5 microm to less than 1 microm for a nominal FWHM value of 25 microm. The specifications of both devices, the methods used to measure these characteristics, and the results are presented.
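The stated geometry lends itself to a quick sanity check. The back-of-envelope sketch below is illustrative only (it simply applies the numbers of the fixed-setting device quoted above): it counts how many microbeams span the 50 mm field and the width of the unirradiated valley between adjacent beams.

```python
# Back-of-envelope check (illustrative, not from the paper): geometry of the
# fixed single-slit device, 50 microm FWHM at 400 microm c-t-c spacing.
fwhm_um = 50          # microbeam width (FWHM)
ctc_um = 400          # center-to-center spacing
field_um = 50_000     # 50 mm irradiation field width

n_beams = field_um // ctc_um     # one microbeam every 400 um across the field
valley_um = ctc_um - fwhm_um     # unirradiated gap between adjacent peaks

print(n_beams)     # 125 parallel microbeams across the 50 mm field
print(valley_um)   # 350 um valley between adjacent 50 um peaks
```

The wide valley relative to the narrow peaks is what makes mechanical regularity so critical: a few microns of slit-to-slit variation directly perturbs the peak-to-valley dose ratio.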
Abstract:
A system for screening of nutritional risk is described. It is based on the concept that nutritional support is indicated in patients who are severely ill with increased nutritional requirements, or who are severely undernourished, or who have certain degrees of severity of disease in combination with certain degrees of undernutrition. Degrees of severity of disease and undernutrition were defined as absent, mild, moderate or severe from data sets in a selected number of randomized controlled trials (RCTs) and converted to a numeric score. After completion, the screening system was validated against all published RCTs known to us of nutritional support vs spontaneous intake to investigate whether the screening system could distinguish between trials with a positive outcome and trials with no effect on outcome.
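The scoring combination described above can be sketched in a few lines. This is a hedged illustration, not the validated screening system: the 0-3 grading follows the absent/mild/moderate/severe scale in the abstract, while the trigger threshold of 3 is an assumption chosen so that a "severe" grade alone suffices and intermediate grades must combine.

```python
# Minimal sketch of the screening logic described above. The numeric grades
# mirror the absent/mild/moderate/severe scale; the threshold of 3 is an
# illustrative assumption, not the published validated cutoff.
SEVERITY = {"absent": 0, "mild": 1, "moderate": 2, "severe": 3}

def nutritional_risk_score(undernutrition: str, disease_severity: str) -> int:
    """Combine the two graded components into a single numeric score."""
    return SEVERITY[undernutrition] + SEVERITY[disease_severity]

def support_indicated(undernutrition, disease_severity, threshold=3):
    # Support is indicated for severely undernourished OR severely ill
    # patients, or for combinations whose summed score reaches the threshold.
    return nutritional_risk_score(undernutrition, disease_severity) >= threshold

print(support_indicated("severe", "absent"))    # True: severe undernutrition alone
print(support_indicated("mild", "mild"))        # False: 1 + 1 < 3
print(support_indicated("moderate", "mild"))    # True: 2 + 1 >= 3
```

The additive form captures the abstract's key idea: lesser degrees of undernutrition still trigger support when combined with sufficient disease severity.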
Abstract:
A close-to-native structure of bulk biological specimens can be imaged by cryo-electron microscopy of vitreous sections (CEMOVIS). In some cases structural information can be combined with X-ray data, leading to atomic resolution in situ. However, CEMOVIS is not routinely used. The two critical steps consist of producing a frozen section ribbon of a few millimeters in length and transferring the ribbon onto an electron microscopy grid. During these steps, the first sections of the ribbon are wrapped around an eyelash (unwrapping is frequent). When a ribbon is sufficiently attached to the eyelash, the operator must guide the nascent ribbon. Steady hands are required: shaking or overstretching may break the ribbon, which then immediately wraps around itself or flies away and thereby becomes unusable. Micromanipulators for eyelashes and grids, as well as ionizers to attach section ribbons to grids, have been proposed. The rate of successful ribbon collection, however, remained low for most operators. Here we present a setup composed of two micromanipulators. One micromanipulator guides an electrically conductive fiber to which the ribbon sticks with unprecedented efficiency compared with a nonconductive eyelash. The second micromanipulator positions the grid beneath the newly formed section ribbon, and with the help of an ionizer the ribbon is attached to the grid. Although sectioning artifacts remain, manipulations are greatly facilitated, and the likelihood of obtaining high-quality sections is significantly increased by the large number of sections that can be produced with the reported tool.
Abstract:
BACKGROUND Reducing the fraction of transmissions during recent human immunodeficiency virus (HIV) infection is essential for the population-level success of "treatment as prevention". METHODS A phylogenetic tree was constructed with 19 604 Swiss sequences and 90 994 non-Swiss background sequences. Swiss transmission pairs were identified using 104 combinations of genetic distance (1%-2.5%) and bootstrap (50%-100%) thresholds, to examine the effect of those criteria. Monophyletic pairs were classified as recent or chronic transmission based on the time interval between estimated seroconversion dates. Logistic regression with adjustment for clinical and demographic characteristics was used to identify risk factors associated with transmission during recent or chronic infection. FINDINGS Seroconversion dates were estimated for 4079 patients on the phylogeny, yielding between 71 (distance, 1%; bootstrap, 100%) and 378 (distance, 2.5%; bootstrap, 50%) transmission pairs. We found that 43.7% (range, 41%-56%) of the transmissions occurred during the first year of infection. A stricter phylogenetic definition of transmission pairs was associated with a higher recent-phase transmission fraction. Chronic-phase viral load area under the curve (adjusted odds ratio, 3; 95% confidence interval, 1.64-5.48) and time to antiretroviral therapy (ART) start (adjusted odds ratio, 1.4/y; 1.11-1.77) were associated with chronic-phase rather than recent transmission. Importantly, at least 14% of the chronic-phase transmission events occurred after the transmitter had interrupted ART. CONCLUSIONS We demonstrate a high fraction of transmission during recent HIV infection but also chronic-phase transmissions after interruption of ART in Switzerland. Both represent key issues for treatment as prevention and underline the importance of early diagnosis and of early and continuous treatment.
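The pair-definition and phase-classification steps can be sketched as follows. Function names, the particular thresholds, and all example values are hypothetical; the study actually swept 104 combinations of distance (1%-2.5%) and bootstrap (50%-100%) thresholds, and "recent" here follows the abstract's first-year-of-infection criterion.

```python
# Illustrative sketch of the pair definition and recent/chronic classification.
# Thresholds and example values are hypothetical; the study varied distance
# over 1%-2.5% and bootstrap support over 50%-100% (104 combinations).

def is_transmission_pair(genetic_distance, bootstrap,
                         max_distance=0.015, min_bootstrap=0.90):
    """Accept a monophyletic pair under one example threshold combination."""
    return genetic_distance <= max_distance and bootstrap >= min_bootstrap

def classify_phase(transmitter_seroconversion_yr, transmission_yr):
    """Recent if transmission fell within the transmitter's first year of infection."""
    interval = transmission_yr - transmitter_seroconversion_yr
    return "recent" if interval <= 1.0 else "chronic"

print(is_transmission_pair(0.012, 0.95))   # True: close and well supported
print(is_transmission_pair(0.030, 0.95))   # False: genetic distance too large
print(classify_phase(2010.0, 2010.6))      # recent
print(classify_phase(2005.0, 2012.3))      # chronic
```

Tightening `max_distance` and `min_bootstrap` prunes ambiguous pairs first, which is consistent with the reported finding that stricter definitions raise the recent-phase fraction.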
EPANET Input Files of New York tunnels and Pacific City used in a metamodel-based optimization study
Abstract:
Metamodels have proven to be very useful when it comes to reducing the computational requirements of Evolutionary Algorithm-based optimization by acting as quick-solving surrogates for slow-solving fitness functions. The relationship between metamodel scope and objective function varies between applications: in some cases the metamodel acts as a surrogate for the whole fitness function, whereas in other cases it replaces only a component of the fitness function. This paper presents a formalized qualitative process to evaluate a fitness function and determine the most suitable metamodel scope, so as to increase the likelihood of calibrating a high-fidelity metamodel and hence obtain good optimization results in a reasonable amount of time. The process is applied to the risk-based optimization of water distribution systems, a very computationally intensive problem for real-world systems. The process is validated with a simple case study (modified New York Tunnels) and the power of metamodelling is demonstrated on a real-world case study (Pacific City) with a computational speed-up of several orders of magnitude.
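The distinction between whole-fitness and component-scope metamodels can be illustrated with a toy surrogate. Everything below is hypothetical: an analytic function stands in for a slow hydraulic solve (e.g. an EPANET run), and a fitted polynomial plays the metamodel that replaces only the expensive component while the cheap cost term stays exact.

```python
# Toy illustration of component-scope metamodelling (all functions invented).
import numpy as np

def expensive_component(x):      # stands in for a slow simulation, e.g. hydraulics
    return np.sin(x) + 0.5 * x**2

def cheap_component(x):          # e.g. a pipe cost term, trivially computable
    return 0.1 * x

# Calibrate a surrogate for the expensive component only (component scope).
xs = np.linspace(-2, 2, 50)
coeffs = np.polyfit(xs, expensive_component(xs), deg=9)
surrogate = np.poly1d(coeffs)

def fitness_true(x):             # full fitness: expensive + cheap components
    return expensive_component(x) + cheap_component(x)

def fitness_metamodel(x):        # surrogate replaces only the expensive part
    return surrogate(x) + cheap_component(x)

err = max(abs(fitness_true(x) - fitness_metamodel(x)) for x in np.linspace(-2, 2, 201))
print(err < 1e-3)   # surrogate-based fitness tracks the true fitness closely
```

Keeping the cheap component exact is the point of component scope: the metamodel only has to learn the hard-to-compute part, which makes calibrating a high-fidelity surrogate more likely.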
Abstract:
The Bounty Trough, east of New Zealand, lies along the southeastern edge of the present-day Subtropical Front (STF) and is a major conduit, via the Bounty Channel, for terrigenous sediment supply from the uplifted Southern Alps to the abyssal Bounty Fan. Census data on 65 benthic foraminiferal faunas (>63 µm) from upper bathyal (ODP 1119), lower bathyal (DSDP 594) and abyssal (ODP 1122) sequences are used to test and refine existing models for the paleoceanographic and sedimentary history of the trough through the last 150 ka (marine isotope stages, MIS 6-1). Cluster analysis allows recognition of six species groups, whose distribution patterns coincide with bathymetry, the climate cycles and displaced turbidite beds. Detrended canonical correspondence analysis and comparisons with modern faunal patterns suggest that the groups are most strongly influenced by food supply (organic carbon flux), and to a lesser extent by bottom water oxygen and factors relating to sediment type. Major faunal changes at upper bathyal depths (1119) probably resulted from cycles of counter-intuitive seaward-landward migrations of the Southland Front (SF) (north-south sector of the STF). Benthic foraminiferal changes suggest that lower-nutrient, cool Subantarctic Surface Water (SAW) was overhead in warm intervals, and higher-nutrient, warm neritic Subtropical Surface Water (STW) was overhead in cold intervals. At lower bathyal depths (594), foraminiferal changes indicate increased glacial productivity and lowered bottom oxygen, attributed to increased upwelling and inflow of cold, nutrient-rich Antarctic Intermediate Water (AAIW) and shallowing of the oxygen-minimum zone (upper Circumpolar Deep Water, CPDW).
The observed cyclical benthic foraminiferal changes are not a result of associations migrating up and down the slope, as glacial faunas (dominated by Globocassidulina canalisuturata and Eilohedra levicula at upper and lower bathyal depths, respectively) are markedly different from those currently living in the Bounty Trough. On the abyssal Bounty Fan (1122), faunal changes correlate most strongly with grain size, and are attributed to varying amounts of mixing of displaced and in-situ faunas. Most of the displaced foraminifera in turbiditic sand beds are sourced from mid-outer shelf depths at the head of the Bounty Channel. Turbidity currents were more prevalent during, but not restricted to, glacial intervals.
Abstract:
The new user cold start issue represents a serious problem in recommender systems, as it can lead to the loss of new users who decide to stop using the system due to the lack of accuracy in the recommendations received in that first stage, in which they have not yet cast a significant number of votes with which to feed the recommender system's collaborative filtering core. For this reason it is particularly important to design new similarity metrics which provide greater precision in the results offered to users who have cast few votes. This paper presents a new similarity measure perfected using optimization based on neural learning, which exceeds the best results obtained with current metrics. The metric has been tested on the Netflix and Movielens databases, obtaining important improvements in the measures of accuracy, precision and recall when applied to new user cold start situations. The paper includes the mathematical formalization describing how to obtain the main quality measures of a recommender system using leave-one-out cross validation.
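The leave-one-out evaluation scheme mentioned at the end can be sketched with a toy neighbourhood model. This is emphatically not the paper's neural-optimized metric: the similarity below is a deliberately simple baseline on a 1-5 rating scale, and all ratings are invented.

```python
# Toy sketch of leave-one-out evaluation for a cold-start user (hypothetical
# data; the similarity is a simple baseline, not the paper's learned metric).

ratings = {                                      # user -> {item: rating, 1-5 scale}
    "cold_user": {"i1": 4, "i2": 5},             # new user with only two votes
    "u2": {"i1": 4, "i2": 5, "i3": 2, "i4": 1},
    "u3": {"i1": 1, "i2": 2, "i3": 5, "i4": 4},
}

def similarity(a, b):
    """Mean absolute agreement on co-rated items (1.0 = identical votes)."""
    common = set(ratings[a]) & set(ratings[b])
    if not common:
        return 0.0
    return 1 - sum(abs(ratings[a][i] - ratings[b][i]) for i in common) / (4 * len(common))

def predict(user, item):
    """Similarity-weighted mean of neighbours' ratings for the item."""
    pairs = [(similarity(user, v), r[item]) for v, r in ratings.items()
             if v != user and item in r]
    total = sum(s for s, _ in pairs)
    return sum(s * r for s, r in pairs) / total if total else None

# Leave-one-out: hide one known vote, predict it, measure the error.
held_out = ratings["cold_user"].pop("i2")        # hide the vote
pred = predict("cold_user", "i2")
ratings["cold_user"]["i2"] = held_out            # restore the vote
print(abs(pred - held_out))                      # absolute error of the prediction
```

Repeating this hide-predict-restore loop over every known vote and averaging the errors yields the MAE-style quality measures the paper formalizes; precision and recall follow by thresholding the predictions into recommend/not-recommend decisions.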
Abstract:
Advances in electronics nowadays facilitate the design of smart spaces based on physical mash-ups of sensor and actuator devices. At the same time, software paradigms such as the Internet of Things (IoT) and the Web of Things (WoT) are motivating the creation of technology to support the development and deployment of web-enabled embedded sensor and actuator devices, with two major objectives: (i) to integrate sensing and actuating functionalities into everyday objects, and (ii) to easily allow a diversity of devices to plug into the Internet. Currently, developers who apply this Internet-oriented approach need a solid understanding of specific platforms and web technologies. To ease this development process, this research proposes a Resource-Oriented and Ontology-Driven Development (ROOD) methodology based on the Model Driven Architecture (MDA). This methodology aims at enabling the development of smart spaces through a set of modeling tools and semantic technologies that support the definition of the smart space and the automatic generation of code at the hardware level. ROOD feasibility is demonstrated by building an adaptive health monitoring service for a Smart Gym.
Abstract:
Envelope Tracking (ET) and Envelope Elimination and Restoration (EER) are two techniques that have been used as a solution for highly efficient linear RF Power Amplifiers (PA). In both techniques the most important part is a dc-dc converter, called the envelope amplifier, that has to supply the RF PA with a variable voltage. Besides high efficiency, high bandwidth is equally important. An envelope amplifier based on the parallel combination of a switching dc-dc converter and a linear regulator is a widely used architecture due to its simplicity. In this paper we discuss the theoretical limitations of this architecture regarding its efficiency and demonstrate two possible ways of implementing it. To derive the presented conclusions, a theoretical model of the envelope amplifier's efficiency is developed. Additionally, the benefits of the emerging GaN technology for this application are shown.
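The kind of theoretical efficiency model referred to above can be illustrated with a minimal sketch. This is a generic textbook-style estimate, not the authors' exact model: an ideal linear stage fed from a fixed rail dissipates (V_supply - v_out) * i_out, so its average efficiency over the envelope is well below 100%, which is precisely the loss a parallel switching converter is added to reduce. All voltages below are hypothetical.

```python
# Hedged sketch: average efficiency of an ideal linear stage tracking an
# envelope from a fixed supply rail (generic model, hypothetical values).
import math

def linear_stage_efficiency(v_supply, envelope, r_load=1.0):
    """Delivered power over drawn power, averaged over envelope samples.
    Assumes supply current equals load current (ideal class-AB, no bias)."""
    p_out = sum(v**2 / r_load for v in envelope)
    p_in = sum(v_supply * v / r_load for v in envelope)
    return p_out / p_in

# Sinusoidal envelope swinging between 2 V and 10 V, fed from a fixed 12 V rail
env = [6 + 4 * math.sin(2 * math.pi * k / 1000) for k in range(1000)]

print(round(linear_stage_efficiency(12.0, env), 3))   # 0.611
```

Replacing the fixed 12 V rail with a switching converter that tracks the slow-varying part of the envelope shrinks the voltage dropped across the linear device, raising this figure; the achievable improvement versus the required bandwidth is exactly the trade-off the paper's model explores.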
Abstract:
This PhD thesis contributes to the problem of autonomic fault diagnosis in telecommunication networks. In today's telecommunication networks, operators perform diagnosis tasks manually. These operations must be carried out by highly skilled network engineers, who find it increasingly difficult to properly manage the exponential growth of the network in size, complexity and heterogeneity. Moreover, with the advent of the Future Internet, the demand for systems that simplify and automate the management of telecommunication networks has increased in recent years. To collect the domain knowledge required to develop the proposed solutions and to ease their adoption by network operators, an acceptance-testing methodology for multi-agent systems is proposed, focused on simplifying communication between the different groups involved in any software development project: stakeholders and developers. To contribute to solving the problem of autonomic fault diagnosis, an agent architecture capable of autonomously diagnosing faults in telecommunication networks is proposed. This architecture extends the Belief-Desire-Intention (BDI) agent model with diagnostic models that handle the different subtasks of the process. The proposed architecture combines different reasoning techniques to achieve its purpose: a structural model of the network, which uses ontology-based reasoning, and a causal fault model, which uses Bayesian reasoning to properly handle the uncertainty of the diagnosis process. To ensure the suitability of the proposed architecture in highly complex and heterogeneous environments, an argumentation framework is proposed that allows agents running in federated domains to perform diagnosis. To apply this framework in a multi-agent system, a coordination protocol is defined, through which agents dialogue until a reliable conclusion is reached for a specific diagnosis case. Future work includes extending the architecture to address other management problems, such as self-discovery or self-optimisation; applying reputation techniques within the argumentation framework to improve the extensibility of the diagnostic system in federated domains; and applying the proposed architectures to emerging network architectures, such as SDN, which offer greater capabilities for interacting with the network.
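The Bayesian side of the causal fault model described above can be illustrated with a toy update. The fault set, the symptom, and every probability below are hypothetical; in the thesis the structural (ontology-based) model would supply the network context from which such a table is built.

```python
# Toy sketch of Bayesian fault diagnosis (hypothetical faults and numbers):
# a single symptom observation updates the probability of each candidate fault.

PRIOR = {"link_down": 0.05, "misconfig": 0.10, "overload": 0.15}

# P(packet_loss | fault) for one observable symptom
LIKELIHOOD = {"link_down": 0.99, "misconfig": 0.40, "overload": 0.70}

def posterior(symptom_observed=True):
    """Bayes' rule over the candidate fault set (treated as exclusive)."""
    joint = {f: PRIOR[f] * (LIKELIHOOD[f] if symptom_observed else 1 - LIKELIHOOD[f])
             for f in PRIOR}
    z = sum(joint.values())                 # normalize over the candidates
    return {f: p / z for f, p in joint.items()}

post = posterior(symptom_observed=True)
best = max(post, key=post.get)
print(best)   # most probable fault given the observed packet loss
```

Note how the update balances priors against likelihoods: "link_down" explains packet loss almost perfectly, yet the more common "overload" still wins; handling exactly this kind of uncertainty is why the architecture pairs the ontology-based structural model with Bayesian causal reasoning.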