999 results for Ramsay, Anders


Relevance:

10.00%

Publisher:

Abstract:

Molecular machinery on the micro-scale, believed to constitute the fundamental building blocks of life, involves forces of 1-100 pN and movements of nanometers to micrometers. Micromechanical single-molecule experiments seek to understand the physics of nucleic acids, molecular motors, and other biological systems through direct measurement of forces and displacements. Optical tweezers are a popular choice among several complementary techniques for sensitive force spectroscopy in the field of single-molecule biology. The main objective of this thesis was to design and construct an optical tweezers instrument capable of investigating the physics of molecular motors and the mechanisms of protein/nucleic-acid interactions at the single-molecule level. A double-trap optical tweezers instrument incorporating acousto-optic trap steering, two independent detection channels, and a real-time digital controller was built. A numerical simulation and a theoretical study were performed to assess the signal-to-noise ratio in a constant-force molecular motor stepping experiment. Real-time feedback control of optical tweezers was explored in three studies. Position clamping was implemented and compared to theoretical models using both proportional and predictive control. A force clamp was implemented and tested with a DNA tether in the presence of the enzyme lambda exonuclease. The results of the study indicate that the presented models describing the signal-to-noise ratio in constant-force experiments and feedback control experiments in optical tweezers agree well with experimental data. The effective trap stiffness can be increased by an order of magnitude using the presented position-clamping method. The force clamp can be used for constant-force experiments, and the results from a proof-of-principle experiment, in which the enzyme lambda exonuclease converts double-stranded DNA to single-stranded DNA, agree with previous research. The main objective of the thesis was thus achieved. The developed instrument and the presented results on feedback control serve as a stepping stone for future contributions to the growing field of single-molecule biology.
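To make the position-clamping idea concrete, the following is a minimal, illustrative sketch (not taken from the thesis) of a proportional position clamp acting on an overdamped bead in a harmonic trap. The bead parameters, feedback gain, and time step are assumed values chosen only for demonstration; by equipartition, the feedback should raise the apparent trap stiffness by roughly a factor of (1 + G).

```python
import numpy as np

# Illustrative proportional position clamp for an optically trapped bead.
# Overdamped Langevin dynamics: gamma * dx/dt = -kappa * (x - x_trap) + thermal force.
# The feedback steers the trap to x_trap = -G * x, so the bead feels an
# effective stiffness of roughly kappa * (1 + G).

kT = 4.11e-21       # thermal energy at room temperature [J]
gamma = 9.4e-9      # Stokes drag of a ~1 um bead in water [N s/m]
kappa = 1.0e-4      # trap stiffness [N/m] (0.1 pN/nm), assumed value
G = 10.0            # proportional feedback gain, assumed value
dt = 1e-6           # simulation time step [s], finer than a 200 kHz controller cycle
n_steps = 500_000

rng = np.random.default_rng(0)
x = 0.0
x2_sum = 0.0
for _ in range(n_steps):
    x_trap = -G * x                                        # feedback acts on the measured position
    thermal = np.sqrt(2 * kT * gamma / dt) * rng.standard_normal()
    x += (dt / gamma) * (-kappa * (x - x_trap) + thermal)  # Euler-Maruyama step
    x2_sum += x * x

# Equipartition estimate of the effective stiffness: kappa_eff = kT / <x^2>.
kappa_eff = kT / (x2_sum / n_steps)
print(f"effective stiffness ~ {kappa_eff:.2e} N/m  (open-loop trap: {kappa:.1e} N/m)")
```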

Relevance:

10.00%

Publisher:

Abstract:

Branding, like any other concept, has evolved over time: from the days when the sheep of one herd were branded to distinguish them from those of another, to the current era when everything, from water and flowers to clothes and food, is branded. Throughout this time, numerous theories have been proposed to describe and understand the underlying nuances. This paper traces the relationships in the previous literature and shows how these theories view branding from various perspectives and how they can be integrated into a coherent view. We also discuss how branding and society affect each other. Based on an understanding of how branding theories have developed in interdependence with each other and with society, we are able to form a better understanding of the past, the present, and the future of branding.

Relevance:

10.00%

Publisher:

Abstract:

Modern sample surveys started to spread after statisticians at the U.S. Bureau of the Census in the 1940s had developed a sampling design for the Current Population Survey (CPS). A significant factor was also that digital computers became available to statisticians. In the early 1950s, the theory was documented in textbooks on survey sampling. This thesis is about the development of statistical inference for sample surveys. The idea of statistical inference was first enunciated by the French scientist P. S. Laplace. In 1781, he published a plan for a partial investigation in which he determined the sample size needed to reach the desired accuracy in estimation. The plan was based on Laplace's Principle of Inverse Probability and on his derivation of the Central Limit Theorem, both published in a memoir in 1774 which is one of the origins of statistical inference. Laplace's inference model was based on Bernoulli trials and binomial probabilities. He assumed that populations were changing constantly; this was expressed by assuming a priori distributions for the parameters. Laplace's inference model dominated statistical thinking for a century. Sample selection in Laplace's investigations was purposive. At the 1894 meeting of the International Statistical Institute, the Norwegian Anders Kiaer presented the idea of the Representative Method for drawing samples. Its idea, still prevailing today, was that the sample would be a miniature of the population. The virtues of random sampling were known, but practical problems of sample selection and data collection hindered its use. Arthur Bowley realized the potential of Kiaer's method and, at the beginning of the 20th century, carried out several surveys in the UK. He also developed a theory of statistical inference for finite populations, based on Laplace's inference model. R. A. Fisher's contributions in the 1920s constitute a watershed in statistical science. He revolutionized the theory of statistics and introduced a new statistical inference model which is still the prevailing paradigm. Its essential ideas are to draw samples repeatedly from the same population and to assume that population parameters are constants; Fisher's theory did not include a priori probabilities. Jerzy Neyman adopted Fisher's inference model and applied it to finite populations, with the difference that Neyman's inference model does not include any assumptions about the distributions of the study variables. Applying Fisher's fiducial argument, he developed the theory of confidence intervals. Neyman's last contribution to survey sampling presented a theory for double sampling. This gave the central idea for statisticians at the U.S. Census Bureau to develop the complex survey design for the CPS. An important criterion was to have a method in which the costs of data collection were acceptable and which provided approximately equal interviewer workloads, besides sufficient accuracy in estimation.
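As an illustration only (not part of the thesis abstract), the design-based, repeated-sampling framework that Neyman formalized leads to the familiar confidence interval for a finite-population mean under simple random sampling without replacement:

```latex
\[
  \bar{y} \;\pm\; z_{\alpha/2}\,
  \sqrt{\left(1-\frac{n}{N}\right)\frac{s^{2}}{n}},
  \qquad
  s^{2} = \frac{1}{n-1}\sum_{i=1}^{n}\left(y_i-\bar{y}\right)^{2},
\]
```

where n is the sample size, N the population size, and the probability statement refers to repeated draws of the sample rather than to a prior distribution on the parameter.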

Relevance:

10.00%

Publisher:

Abstract:

Single-molecule force clamp experiments are widely used to investigate how enzymes, molecular motors, and other molecular mechanisms work. We developed a dual-trap optical tweezers instrument with real-time (200 kHz update rate) force clamp control that can exert forces of 0–100 pN on trapped beads. A model for force clamp experiments in the dumbbell geometry is presented. We observe good agreement between the predicted and measured power spectra of bead position and force fluctuations. The model can be used to predict and optimize the dynamics of real-time force clamp optical tweezers instruments. The results from a proof-of-principle experiment, in which lambda exonuclease converts a double-stranded DNA tether, held at constant tension, into its single-stranded form, show that the developed instrument is suitable for experiments in single-molecule biology.
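For orientation, here is a minimal sketch (not from the paper) of the kind of spectral prediction such a model is compared against: the one-sided Lorentzian power spectrum of an overdamped bead in a single harmonic trap. The bead and trap parameters are assumed values for illustration; the full dumbbell-geometry model in the paper is more involved.

```python
import numpy as np

# Lorentzian power spectrum of a trapped bead's position fluctuations (one-sided):
#   S_x(f) = kT / (pi^2 * gamma * (fc^2 + f^2)),  with corner frequency fc = kappa / (2 * pi * gamma).
# Illustrative single-trap parameters, not the instrument's calibration.

kT = 4.11e-21     # thermal energy at room temperature [J]
gamma = 9.4e-9    # Stokes drag of a ~1 um bead in water [N s/m]
kappa = 2.0e-4    # trap stiffness [N/m], assumed value

fc = kappa / (2 * np.pi * gamma)                  # corner frequency [Hz]
f = np.logspace(0, 5, 500)                        # 1 Hz to 100 kHz
S_x = kT / (np.pi**2 * gamma * (fc**2 + f**2))    # position PSD [m^2/Hz]
S_F = kappa**2 * S_x                              # open-loop force fluctuations scale as kappa * x

print(f"corner frequency: {fc:.0f} Hz")
```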

Relevance:

10.00%

Publisher:

Abstract:

Background: Pubertal timing is a strongly heritable trait, but no single puberty gene has been identified; thus, the genetic background of idiopathic central precocious puberty (ICPP) is poorly understood. Overall, the genetic modulation of pubertal onset most likely arises from the additive effect of multiple genes, but monogenic causes of ICPP probably also exist, as cases of familial ICPP have been reported. Mutations in KISS1 and KISS1R, coding for kisspeptin and its receptor, which are involved in GnRH secretion and puberty onset, have been suggested as causative for monogenic ICPP. Variation in LIN28B has been associated with the timing of puberty in genome-wide association (GWA) studies. LIN28B is a human ortholog of the gene that controls, through microRNAs, developmental timing in C. elegans. In addition, Lin28a transgenic mice manifest the puberty phenotypes identified in the human GWA studies. Thus, both LIN28B and LIN28A may have a role in pubertal development and are good candidate genes for monogenic ICPP. Methods: Thirty girls with ICPP were included in the study. ICPP was defined by pubertal onset before 8 years of age and a pubertal LH response to GnRH testing. The coding regions of LIN28B, LIN28A, KISS1, and KISS1R were sequenced. The missense change in LIN28B was also screened in 132 control subjects. Results: No rare variants were detected in KISS1 or KISS1R in the 30 subjects with ICPP. In LIN28B, one missense change, His199Arg, was found in one subject with ICPP; however, this variant was also detected in one of the 132 controls. No variation in LIN28A was found. Conclusions: We did not find any evidence that mutations in LIN28B or LIN28A underlie ICPP. In addition, we confirmed that mutations in KISS1 and KISS1R are not a common cause of ICPP.

Relevance:

10.00%

Publisher:

Abstract:

Background: India has the third largest HIV-1 epidemic, with 2.4 million infected individuals. Molecular epidemiological analysis has identified the predominance of HIV-1 subtype C (HIV-1C). However, previous reports have been limited by sample size and uneven geographical distribution, so the introduction of HIV-1C into India remains uncertain. To fill this gap, we characterised the distribution pattern of HIV-1 subtypes in India based on data collected from nationwide clinical cohorts between 2007 and 2011. We also reconstructed the time to the most recent common ancestor (tMRCA) of the predominant HIV-1C strains. Methodology/Principal Findings: Blood samples were collected from 168 HIV-1 seropositive subjects from 7 different states. HIV-1 subtypes were determined from two or three genes (gag, pol, and env) using several methods. A Bayesian coalescent-based approach was used to reconstruct the time of introduction and the population growth patterns of Indian HIV-1C. For the first time, a high prevalence (10%) of unique recombinant forms (BC and A1C) was observed when two or three genes were used instead of one gene (p<0.01; p = 0.02, respectively). The tMRCA of Indian HIV-1C, estimated from the three viral genes, ranged from 1967 (gag) to 1974 (env); the pol-gene analysis was considered to provide the most reliable estimate, 1971 (95% CI: 1965-1976). The population growth pattern revealed an initial slow growth phase in the mid-1970s, an exponential phase through the 1980s, and a stationary phase since the early 1990s. Conclusions/Significance: The Indian HIV-1C epidemic originated around 40 years ago from a single or a few genetically related African lineages and has since largely evolved independently. The effective population size in the country has been broadly stable since the 1990s. The evolving viral epidemic, as indicated by the increase in recombinant strains, warrants continued molecular surveillance to guide efficient disease intervention strategies.

Relevance:

10.00%

Publisher:

Abstract:

The Lovasz θ function of a graph is a fundamental tool in combinatorial optimization and approximation algorithms. Computing θ involves solving an SDP and is extremely expensive even for moderately sized graphs. In this paper we establish that the Lovasz θ function is equivalent to a kernel learning problem related to the one-class SVM. This interesting connection opens up many opportunities for bridging graph-theoretic algorithms and machine learning. We show that there exist graphs, which we call SVM-θ graphs, on which the Lovasz θ function can be approximated well by a one-class SVM. This leads to a novel use of SVM techniques to solve algorithmic problems in large graphs, e.g. identifying a planted clique of size Θ(√n) in a random graph G(n, 1/2). A classic approach to this problem involves computing the θ function; however, it is not scalable due to the SDP computation. We show that a random graph with a planted clique is an example of an SVM-θ graph, and as a consequence an SVM-based approach easily identifies the clique in large graphs and is competitive with the state of the art. Further, we introduce the notion of a "common orthogonal labelling", which extends the notion of an orthogonal labelling of a single graph (used in defining the θ function) to multiple graphs. The problem of finding the optimal common orthogonal labelling is cast as a multiple kernel learning problem and is used to identify a large common dense region in multiple graphs. The proposed algorithm achieves an order-of-magnitude improvement in scalability over the state of the art.
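As a point of reference (this is the expensive baseline the abstract refers to, not the paper's SVM-based method), the Lovasz θ of a graph can be computed directly from its standard SDP formulation. The sketch below assumes cvxpy and networkx are available and is meant only to illustrate why the SDP route does not scale to large graphs.

```python
import cvxpy as cp
import networkx as nx

def lovasz_theta(G):
    """Lovasz theta via the standard SDP:
    maximize sum(B) subject to B PSD, trace(B) = 1, and B_ij = 0 for every edge (i, j)."""
    nodes = list(G.nodes())
    n = len(nodes)
    idx = {v: i for i, v in enumerate(nodes)}
    B = cp.Variable((n, n), PSD=True)
    constraints = [cp.trace(B) == 1]
    for u, v in G.edges():
        constraints.append(B[idx[u], idx[v]] == 0)
        constraints.append(B[idx[v], idx[u]] == 0)
    prob = cp.Problem(cp.Maximize(cp.sum(B)), constraints)
    prob.solve()
    return prob.value

# Sanity check: theta of the 5-cycle is sqrt(5) ~ 2.236
print(lovasz_theta(nx.cycle_graph(5)))
```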

Relevance:

10.00%

Publisher:

Abstract:

In this paper we establish that the Lovasz theta function of a graph can be restated as a kernel learning problem. We introduce the notion of SVM-theta graphs, on which the Lovasz theta function can be approximated well by a support vector machine (SVM). We show that Erdos-Renyi random G(n, p) graphs are SVM-theta graphs for log^4(n)/n <= p < 1. Even if we embed a large clique of size Theta(sqrt(np/(1-p))) in a G(n, p) graph, the resulting graph still remains an SVM-theta graph. This immediately suggests an SVM-based algorithm for recovering a large planted clique in random graphs. Associated with the theta function is the notion of orthogonal labellings. We introduce common orthogonal labellings, which extend the idea of orthogonal labellings to multiple graphs. This allows us to propose a multiple kernel learning (MKL) based solution that is capable of identifying a large common dense subgraph in multiple graphs. In both the planted clique case and the common dense subgraph detection problem, the proposed solutions beat the state of the art by an order of magnitude.
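As a small illustration of the experimental setup described here (the SVM-based recovery algorithm itself is the paper's contribution and is not reproduced), one can generate an Erdos-Renyi graph with a planted clique as follows; the function name and parameter choices are ours, not the paper's.

```python
import random
import networkx as nx

def planted_clique_graph(n, p, k, seed=0):
    """Return a G(n, p) random graph with a clique planted on k random vertices,
    together with the planted vertex set (the ground truth a recovery method should find)."""
    rng = random.Random(seed)
    G = nx.gnp_random_graph(n, p, seed=seed)
    clique = rng.sample(sorted(G.nodes()), k)
    G.add_edges_from((u, v) for i, u in enumerate(clique) for v in clique[i + 1:])
    return G, set(clique)

# Example: n = 2000, p = 0.5, planted clique of size ~ 3 * sqrt(n)
G, truth = planted_clique_graph(2000, 0.5, k=int(3 * 2000 ** 0.5))
print(G.number_of_nodes(), G.number_of_edges(), len(truth))
```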

Relevance:

10.00%

Publisher:

Abstract:

The base (BOP) and the top (TOP) of the world income pyramid represent the poor and the people of developed countries, respectively. The design of products for the BOP is an important ingredient of the poverty reduction approach that combines business development with poverty alleviation. However, the current understanding of design for the BOP is limited. This study, using a protocol analysis, compared design processes for the BOP and TOP markets. The results indicate differences between the design processes for these markets in terms of the design strategy employed by the designers (i.e. a problem-driven or solution-driven strategy), their requirements-handling behaviour, and their information behaviour. (C) 2014 Elsevier Ltd. All rights reserved.

Relevance:

10.00%

Publisher:

Abstract:

Bioenergy deployment offers significant potential for climate change mitigation, but also carries considerable risks. In this review, we bring together the perspectives of the various communities involved in the research and regulation of bioenergy deployment in the context of climate change mitigation: land-use and energy experts, land-use and integrated assessment modelers, human geographers, ecosystem researchers, climate scientists, and two different strands of life-cycle assessment experts. We summarize technological options, outline the state-of-the-art knowledge on various climate effects, provide an update on estimates of technical resource potential, and comprehensively identify sustainability effects. Cellulosic feedstocks, increased end-use efficiency, improved land carbon-stock management and residue use, and, when fully developed, BECCS appear as the most promising options, depending on development costs, implementation, learning, and risk management. Combined heat and power, efficient biomass cookstoves, and small-scale power generation for rural areas can help to promote energy access and sustainable development, along with reduced emissions. We estimate the sustainable technical potential as up to 100 EJ (high agreement), 100-300 EJ (medium agreement), and above 300 EJ (low agreement). Stabilization scenarios indicate that bioenergy may supply from 10 to 245 EJ per year of global primary energy by 2050. Models indicate that, if technological and governance preconditions are met, large-scale deployment (>200 EJ), together with BECCS, could help to keep global warming below 2 degrees C relative to preindustrial levels; but such high deployment of land-intensive bioenergy feedstocks could also lead to detrimental climate effects and negatively impact ecosystems, biodiversity, and livelihoods. The integration of bioenergy systems into agriculture and forest landscapes can improve land and water use efficiency and help address concerns about environmental impacts. We conclude that the high variability in pathways, uncertainties in technological development, and ambiguity in political decisions make forecasts of deployment levels and climate effects very difficult. However, uncertainty about projections should not preclude pursuing beneficial bioenergy options.

Relevance:

10.00%

Publisher:

Abstract:

The salient feature of our contemporary situation is that it is penetrated, in an ever broader and deeper way, by the impact of diverse technologies and, with regard to the interest of our topic, by biomedical technologies. On this occasion we would like to begin our reflection by pointing to the prestigious and deeply significant antecedent of the question of man in the tragic thought of Sophocles, in his tragedy Antigone, and to extend this reflection through the examination of two contemporary works, convergent in their philosophical and humanistic concern, written at two different historical moments by Günther Anders, in German and later translated into Spanish, and by José Ignacio Murillo, in Spanish. The aim here is to set out the essentials of the works of both of these scholars in order to illuminate the theme of our presentation. This immediately makes us indebted to their best ideas and to their distinctive insight into a matter of continuing relevance: the intangible dignity of the human person, considered from the most diverse possible angles, in its inevitable relation to the impact of current technologies, which affect our lives in multiple dimensions and meanings.

Relevance:

10.00%

Publisher:

Abstract:

A large part of western Manatee County is devoted to the growing of winter vegetables and citrus fruits. As in most of peninsular Florida, rainfall in the county during the growing season is not sufficient for crop production, and large quantities of artesian water are used for irrigation. The large withdrawals of artesian water for irrigation result in a considerable decline of the artesian head in the western part of the county. This seasonal decline of the artesian head has become larger as the withdrawal of artesian water has increased. The lowering of the fresh-water head in some coastal areas of the State has resulted in an infiltration of sea water into the water-bearing formations. The presence of salty water in the artesian aquifer in parts of the coastal area of Manatee County indicates that sea water may also have entered the water-bearing formations in this area as a result of the decline of artesian pressure during the growing season. The purpose of this investigation is to make a detailed study of the geology and ground-water resources of the county, primarily to determine whether salt-water encroachment has occurred or is likely to occur in the coastal area. (PDF contains 38 pages.)