998 results for Postsynaptic density targeting


Relevance: 20.00%

Publisher:

Abstract:

This paper uses data on 1338 rural households in the Northern Mountainous Region of Vietnam to examine the extent to which subsidised credit targets the poor, and its impacts. Principal Component Analysis and Propensity Score Matching were used to evaluate the depth of outreach and the income impact of credit. To address the problem of model uncertainty, Bayesian Model Averaging applied to a probit model was used. Results showed that subsidised credit successfully targeted poor households, with 24.10% and 69.20% of clients falling into the poorest group and the three bottom groups, respectively. Moreover, those who received subsidised credit make up 83% of ethnic minority households. These results indicate that governmental subsidies are necessary to reach poor and low-income households, who need capital but are normally bypassed by commercial banks. Analyses also showed that the ethnicity and age of household heads, the number of helpers, savings, and how strongly households were affected by shocks were all factors that further explained the probability of accessing subsidised credit. Furthermore, recipients obtained a 2.61% higher total income and a 5.93% higher farm income compared to non-recipients. However, these small effects are statistically insignificant at the 5% level. Although the subsidised credit is insufficient to significantly improve the income of poor households, it possibly prevents these households from becoming even poorer.
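
For readers unfamiliar with the matching step, the sketch below illustrates, under stated assumptions, how a propensity-score-matching estimate of the income impact could be set up: a probit model of credit access followed by nearest-neighbour matching. The column names and the synthetic data are hypothetical, and the Bayesian Model Averaging over probit specifications used in the paper is not reproduced.

import numpy as np
import pandas as pd
import statsmodels.api as sm
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n = 1338
# Hypothetical covariates, treatment and outcome, for illustration only.
df = pd.DataFrame({
    "ethnic_minority": rng.integers(0, 2, n),
    "head_age": rng.normal(45, 12, n),
    "savings": rng.exponential(1.0, n),
})
df["credit"] = rng.binomial(1, 0.4, n)
df["income"] = 10 + 0.5 * df["savings"] + 0.3 * df["credit"] + rng.normal(0, 1, n)

# 1) Probit model of credit access on household covariates (no model averaging here).
X = sm.add_constant(df[["ethnic_minority", "head_age", "savings"]])
fit = sm.Probit(df["credit"], X).fit(disp=0)
pscore = pd.Series(fit.predict(X), index=df.index)

# 2) One-to-one nearest-neighbour matching on the propensity score.
treated, control = df[df["credit"] == 1], df[df["credit"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(pscore.loc[control.index].values.reshape(-1, 1))
_, idx = nn.kneighbors(pscore.loc[treated.index].values.reshape(-1, 1))

# 3) Average treatment effect on the treated (ATT) for income.
att = treated["income"].values.mean() - control["income"].values[idx.ravel()].mean()
print(f"ATT on income: {att:.3f}")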

Relevance: 20.00%

Publisher:

Abstract:

We formulate density estimation as an inverse operator problem. We then use convergence results for empirical distribution functions to true distribution functions to develop an algorithm for multivariate density estimation. The algorithm is based upon a Support Vector Machine (SVM) approach to solving inverse operator problems. The algorithm is implemented and tested on simulated data from different distributions and different dimensionalities, Gaussians and Laplacians in $R^2$ and $R^{12}$. A performance comparison is made with Gaussian Mixture Models (GMMs). Our algorithm performs as well as or better than the GMMs for the simulations tested and has the added advantage of being automated with respect to parameters.
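
As a rough one-dimensional illustration of the inverse-problem formulation (not the paper's SVM solver), the sketch below fits kernel weights so that the model CDF matches the empirical CDF, then compares the result with a Gaussian Mixture Model; the sample, kernel centres and bandwidth are arbitrary choices.

# Simplified stand-in for the SVM-based inverse-problem solver described above:
# choose non-negative kernel weights so the model CDF tracks the empirical CDF.
import numpy as np
from scipy.optimize import nnls
from scipy.stats import norm
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
x = np.sort(rng.laplace(size=300))           # simulated Laplacian sample
F_emp = np.arange(1, x.size + 1) / x.size    # empirical CDF at the sample points

centers, h = x[::10], 0.5                    # kernel centres and bandwidth (assumed)
A = norm.cdf((x[:, None] - centers[None, :]) / h)   # model CDF design matrix
w, _ = nnls(A, F_emp)                        # non-negative weights
w /= w.sum()                                 # normalize so the density integrates to 1

grid = np.linspace(-5, 5, 200)
p_inv = (w * norm.pdf((grid[:, None] - centers[None, :]) / h) / h).sum(axis=1)

gmm = GaussianMixture(n_components=3).fit(x.reshape(-1, 1))
p_gmm = np.exp(gmm.score_samples(grid.reshape(-1, 1)))
print(p_inv[:5], p_gmm[:5])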

Relevance: 20.00%

Publisher:

Abstract:

In this paper we focus on the problem of estimating a bounded density using a finite combination of densities from a given class. We consider the Maximum Likelihood Estimator (MLE) and the greedy procedure described by Li and Barron. Approximation and estimation bounds are given for both methods. We extend and improve upon the estimation results of Li and Barron, and in particular prove an $O(1/\sqrt{n})$ bound on the estimation error which does not depend on the number of densities in the estimated combination.
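
The greedy procedure can be pictured as repeatedly adding one density from the class and re-weighting the current mixture. The sketch below is a simplified illustration of that idea with a fixed grid of Gaussian candidates and a crude line search over the mixing weight; it is not the exact algorithm analysed in the paper.

# Li–Barron-style greedy mixture fit (simplified): at each step add one candidate
# density and pick the mixing weight by a 1-D likelihood search.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
x = rng.normal(1.0, 1.0, 500)

means = np.linspace(-3, 5, 33)                     # candidate component means (assumed)
cand = norm.pdf(x[:, None], loc=means[None, :])    # candidate densities at the data
f = np.full(x.shape, 1e-12)                        # current mixture density at the data

for k in range(1, 6):
    best = (-np.inf, None, None)
    for j in range(means.size):
        for alpha in np.linspace(0.05, 1.0 if k == 1 else 0.5, 20):
            ll = np.log((1 - alpha) * f + alpha * cand[:, j]).sum()
            if ll > best[0]:
                best = (ll, j, alpha)
    _, j, alpha = best
    f = (1 - alpha) * f + alpha * cand[:, j]
    print(f"step {k}: added mean {means[j]:+.2f}, weight {alpha:.2f}, loglik {best[0]:.1f}")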

Relevance: 20.00%

Publisher:

Abstract:

High-density, uniform GaN nanodot arrays with controllable size have been synthesized using template-assisted selective growth. GaN nanodots with average diameters of 40 nm, 80 nm and 120 nm were selectively grown by metalorganic chemical vapor deposition (MOCVD) on a nano-patterned SiO2/GaN template. The nanoporous SiO2 on the GaN surface was created by inductively coupled plasma (ICP) etching using an anodic aluminum oxide (AAO) template as a mask. This selective regrowth results in highly crystalline GaN nanodots, as confirmed by high-resolution transmission electron microscopy. The narrow size distribution and uniform spatial positioning of the nanoscale dots offer potential advantages over self-assembled dots grown in the Stranski–Krastanow mode.

Relevance: 20.00%

Publisher:

Abstract:

We have developed a system to identify and reuse special gene integration sites that allow high and stable gene expression. A vector, named pRGFP8, was constructed. The plasmid pRGFP8 contains a reporter gene, gfp2, and two extraneous DNA fragments. The gfp2 gene makes it possible to screen for high-expression regions on the chromosome. The extraneous DNA fragments help to create unique loci on the chromosome and increase the gene-targeting frequency by increasing the homology. After transfection into Chinese hamster ovary (CHO) cells, the linearized pRGFP8 can integrate into the chromosome of the host cells and form unique sites. Using FACS, 90 million transfected cells were sorted, the cells with the strongest GFP expression were isolated, and eight stable, high-expression GFP CHO cell lines were selected as candidates for the new host cell. Taking the unique site created by pRGFP8 on the chromosome of the new host cells as a targeting locus, the gfp2 gene was replaced with the gene of interest, human ifngamma, by transfecting the targeting plasmid pRIH-IFN. Then, using FACS, the cells with the dimmest GFP fluorescence were selected. These cells showed a strong ability to produce the protein of interest, IFN-gamma. During the gene-targeting experiment, we found a positive correlation between the fluorescence density of the GFP CHO host cells and the specific production rate of IFN-gamma. This result shows that the strategy underlying our expression system is correct: the production of the protein of interest increases with the fluorescence of the GFP host cells. This system, comprising the new host cell lines and the targeting vector, can be used for high-level expression of a gene of interest. More importantly, by using FACS we can screen all the transfected cells, which reduces the chance of losing the best cells.

Relevance: 20.00%

Publisher:

Abstract:

Compositional data analysis motivated the introduction of a complete Euclidean structure in the simplex of D parts. This was based on the early work of J. Aitchison (1986) and completed recently when the Aitchison distance in the simplex was associated with an inner product and orthonormal bases were identified (Aitchison and others, 2002; Egozcue and others, 2003). A partition of the support of a random variable generates a composition by assigning the probability of each interval to a part of the composition. One can imagine that the partition can be refined and that the probability density would then represent a kind of continuous composition of probabilities in a simplex of infinitely many parts. This intuitive idea leads to a Hilbert space of probability densities by generalizing the Aitchison geometry for compositions in the simplex to the set of probability densities.
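
For reference, one common way to write the Aitchison inner product on the simplex and its analogue for densities on a bounded support [a,b] is sketched below; the notation follows the compositional-data literature rather than any formula quoted in the abstract.

% Aitchison inner product on the simplex S^D and its natural extension to
% densities on a bounded support [a,b]; a sketch of the construction, not a
% quotation from the paper.
\langle \mathbf{x},\mathbf{y}\rangle_a
  = \frac{1}{2D}\sum_{i=1}^{D}\sum_{j=1}^{D}
    \ln\frac{x_i}{x_j}\,\ln\frac{y_i}{y_j},
\qquad
\langle f,g\rangle
  = \frac{1}{2(b-a)}\int_a^b\!\!\int_a^b
    \ln\frac{f(s)}{f(t)}\,\ln\frac{g(s)}{g(t)}\,ds\,dt .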

Relevance: 20.00%

Publisher:

Abstract:

In a seminal paper, Aitchison and Lauder (1985) introduced classical kernel density estimation techniques in the context of compositional data analysis. Indeed, they gave two options for the choice of the kernel to be used in the kernel estimator. One of these kernels is based on the use of the alr transformation on the simplex S^D jointly with the normal distribution on R^(D-1). However, these authors themselves recognized that this method has some deficiencies. A method for overcoming these difficulties, based on recent developments in compositional data analysis and multivariate kernel estimation theory and combining the ilr transformation with the use of the normal density with a full bandwidth matrix, was recently proposed in Martín-Fernández, Chacón and Mateu-Figueras (2006). Here we present an extensive simulation study that compares both methods in practice, thus exploring the finite-sample behaviour of both estimators.
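
A minimal sketch of the ilr-based route is given below, using scipy's Gaussian kernel density estimator as a stand-in for the full-bandwidth-matrix estimator discussed above; the Dirichlet-simulated compositions and the Helmert-type basis are illustrative choices, and the resulting density is expressed in ilr coordinates.

# ilr-based kernel density estimation for 3-part compositions (sketch).
import numpy as np
from scipy.stats import gaussian_kde, dirichlet

def ilr(x):
    """Isometric log-ratio transform of compositions (rows of x sum to 1)."""
    D = x.shape[1]
    # Helmert-type orthonormal basis of the clr hyperplane.
    H = np.zeros((D - 1, D))
    for i in range(D - 1):
        H[i, : i + 1] = 1.0 / (i + 1)
        H[i, i + 1] = -1.0
        H[i] /= np.linalg.norm(H[i])
    clr = np.log(x) - np.log(x).mean(axis=1, keepdims=True)
    return clr @ H.T

rng = np.random.default_rng(3)
comp = dirichlet.rvs([4, 2, 1], size=200, random_state=rng)   # simulated compositions
z = ilr(comp)                                                 # coordinates in R^(D-1)
kde = gaussian_kde(z.T)                                       # KDE in ilr coordinates
print(kde(ilr(np.array([[0.5, 0.3, 0.2]])).T))                # density at a test point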

Relevance: 20.00%

Publisher:

Abstract:

Functional Data Analysis (FDA) deals with samples in which a whole function is observed for each individual. A particular case of FDA arises when the observed functions are density functions, which are also an example of infinite-dimensional compositional data. In this work we compare several methods of dimensionality reduction for this particular type of data: functional principal component analysis (PCA), with or without a previous data transformation, and multidimensional scaling (MDS) for different inter-density distances, one of which takes into account the compositional nature of density functions. The different methods are applied to both artificial and real data (household income distributions).
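
The sketch below contrasts two of the reduction routes mentioned above on a toy sample of densities: ordinary PCA after a centred-log-ratio-style transform, which respects the compositional nature of densities, versus metric MDS on plain L2 distances. The gamma-shaped "income-like" densities, the grid and the number of components are placeholders.

# Two reduction routes for a sample of discretized density functions (sketch).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.manifold import MDS
from scipy.stats import gamma

grid = np.linspace(0.1, 10, 200)
rng = np.random.default_rng(4)
# Sample of "income-like" densities: gamma densities with varying shapes.
dens = np.array([gamma.pdf(grid, a) for a in rng.uniform(1.5, 6.0, 60)])
dens /= (dens.sum(axis=1) * (grid[1] - grid[0]))[:, None]   # renormalize on the grid

# Route 1: clr-style log transform, then ordinary PCA.
logd = np.log(dens)
clr = logd - logd.mean(axis=1, keepdims=True)
scores_pca = PCA(n_components=2).fit_transform(clr)

# Route 2: MDS on pairwise L2 distances (up to the grid spacing) between raw densities.
d2 = np.sqrt(((dens[:, None, :] - dens[None, :, :]) ** 2).sum(axis=2))
scores_mds = MDS(n_components=2, dissimilarity="precomputed").fit_transform(d2)
print(scores_pca[:3], scores_mds[:3])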

Relevance: 20.00%

Publisher:

Abstract:

We present a new approach to modelling and classifying breast parenchymal tissue. Given a mammogram, we first discover the distribution of the different tissue densities in an unsupervised manner, and second, we use this tissue distribution to perform the classification. We achieve this using a classifier based on local descriptors and probabilistic Latent Semantic Analysis (pLSA), a generative model from the statistical text literature. We studied the influence of different descriptors, such as texture and SIFT features, at the classification stage, showing that textons outperform SIFT in all cases. Moreover, we demonstrate that pLSA automatically extracts meaningful latent aspects, generating a compact tissue representation based on the tissue densities that is useful for discrimination in mammogram classification. We show the results of tissue classification on the MIAS and DDSM datasets, and we compare our method with approaches that classified these same datasets, showing the better performance of our proposal.
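
To make the shape of such a pipeline concrete, the sketch below builds a texton dictionary with k-means on small patches, forms bag-of-texton histograms per image, and fits a topic model as a compact tissue representation. sklearn's LatentDirichletAllocation is used here only as a stand-in for pLSA, and the images, patch size and vocabulary size are placeholders.

# Texton dictionary + bag-of-textons + topic model as a tissue representation (sketch).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.feature_extraction.image import extract_patches_2d

rng = np.random.default_rng(5)
images = rng.random((20, 64, 64))                       # placeholder "mammograms"

patches = np.vstack([
    extract_patches_2d(im, (5, 5), max_patches=200, random_state=0).reshape(-1, 25)
    for im in images
])
textons = KMeans(n_clusters=16, n_init=10, random_state=0).fit(patches)

def bag_of_textons(im):
    p = extract_patches_2d(im, (5, 5), max_patches=200, random_state=0).reshape(-1, 25)
    return np.bincount(textons.predict(p), minlength=16)

counts = np.array([bag_of_textons(im) for im in images])
topics = LatentDirichletAllocation(n_components=4, random_state=0)   # stand-in for pLSA
tissue_repr = topics.fit_transform(counts)              # compact per-image representation
print(tissue_repr.shape)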

Relevance: 20.00%

Publisher:

Abstract:

It has been shown that the accuracy of mammographic abnormality detection methods is strongly dependent on breast tissue characteristics, where a dense breast drastically reduces detection sensitivity. In addition, breast tissue density is widely accepted to be an important risk indicator for the development of breast cancer. Here, we describe the development of an automatic breast tissue classification methodology, which can be summarized in a number of distinct steps: 1) the segmentation of the breast area into fatty versus dense mammographic tissue; 2) the extraction of morphological and texture features from the segmented breast areas; and 3) the use of a Bayesian combination of a number of classifiers. The evaluation, based on a large number of cases from two different mammographic data sets, shows a strong correlation ( and 0.67 for the two data sets) between automatic and expert-based Breast Imaging Reporting and Data System (BI-RADS) mammographic density assessment.
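
Step 3 can be illustrated with a simple Bayesian-style product rule: each base classifier contributes a per-class posterior, the posteriors are multiplied under a conditional-independence assumption and renormalized, and the class with the highest combined posterior is chosen. The features, labels and choice of base classifiers below are placeholders, not the paper's.

# Bayesian-style combination of classifier posteriors (sketch).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(6)
X = rng.normal(size=(300, 10))                     # morphological + texture features (placeholder)
y = rng.integers(0, 4, 300)                        # four BI-RADS density classes (placeholder)

classifiers = [RandomForestClassifier(random_state=0),
               KNeighborsClassifier(), GaussianNB()]
for clf in classifiers:
    clf.fit(X[:250], y[:250])

probs = np.stack([clf.predict_proba(X[250:]) for clf in classifiers])
combined = np.exp(np.log(probs + 1e-12).sum(axis=0))   # product rule (independence assumed)
combined /= combined.sum(axis=1, keepdims=True)
pred = combined.argmax(axis=1)
print(pred[:10])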

Relevance: 20.00%

Publisher:

Abstract:

A recent trend in digital mammography is computer-aided diagnosis systems, computerised tools designed to assist radiologists. Most of these systems are used for the automatic detection of abnormalities. However, recent studies have shown that their sensitivity decreases significantly as the density of the breast increases, and this dependence is method specific. In this paper we propose a new approach to the classification of mammographic images according to their breast parenchymal density. Our classification uses information extracted from segmentation results and is based on the underlying breast tissue texture. Classification performance was assessed on a large set of digitised mammograms; the evaluation involves different classifiers and uses a leave-one-out methodology. The results demonstrate the feasibility of estimating breast density using image processing and analysis techniques.
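
The evaluation protocol mentioned above can be written down in a few lines; the sketch below runs leave-one-out cross-validation of a simple classifier on placeholder texture features, standing in for the segmentation-derived features of each mammogram.

# Leave-one-out evaluation of a texture-feature classifier (sketch).
import numpy as np
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(7)
X = rng.normal(size=(80, 12))          # per-mammogram texture features (placeholder)
y = rng.integers(0, 4, 80)             # density class labels (placeholder)

scores = cross_val_score(KNeighborsClassifier(n_neighbors=3), X, y, cv=LeaveOneOut())
print(f"leave-one-out accuracy: {scores.mean():.2f}")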

Relevance: 20.00%

Publisher:

Abstract:

Malignant gliomas are among the most aggressive tumours of the central nervous system (CNS). According to the World Health Organization (WHO) classification of brain tumours, astrocytomas are divided into four grades determined by the underlying pathology. Malignant (high-grade) gliomas thus include anaplastic glioma (grade III) as well as glioblastoma multiforme (GBM, grade IV), the latter being the most aggressive and carrying the worst prognosis (1). The therapeutic management of CNS tumours is based on surgery, radiotherapy and chemotherapy, depending on the characteristics of the tumour, the clinical stage and the patient's age (2),(3); however, none of the standard treatments is completely safe and compatible with an acceptable quality of life (3),(4). In general, chemotherapy is the first option for disseminated tumours, such as invasive glioblastoma and high-risk or multiply metastatic medulloblastoma, but the prognosis for these patients is very poor (2),(3). Only new targeted therapies (2), such as anti-angiogenic therapies (4) or gene therapies, show a real benefit in limited groups of patients with known, specific molecular defects (4). The development of new pharmacological therapies against brain tumours is therefore necessary. Malignant gliomas are frequently chemoresistant, and this resistance appears to depend on at least two mechanisms. First, many anticancer drugs penetrate poorly across the blood-brain barrier (BBB), the blood-cerebrospinal fluid barrier (BCSFB) and the blood-tumor barrier (BTB); this resistance is due to the interaction of the drug with several ABC (ATP-binding cassette) drug-efflux transporters or pumps that are overexpressed in the endothelial or epithelial cells of these barriers. Second, the ABC drug-efflux transporters of the tumour cells themselves confer a phenotype known as multidrug resistance (MDR), which is characteristic of several solid tumours; this phenotype is also present in CNS tumours, and its role in gliomas is under investigation (5). Drug delivery across the BBB is therefore one of the central problems in targeted therapy. Recent studies have shown that some of the small molecules used in these therapies are substrates of P-glycoprotein (Pgp), as well as of other efflux pumps such as the multidrug resistance-related proteins (MRPs) or the breast cancer resistance protein (BCRP), which prevent drugs of this type from reaching the tumour (1). One substrate of Pgp and BCRP is DOXOrubicin (DOXO), a drug used in anticancer therapy that is very effective against brain tumour cells in vitro but whose clinical use is limited by poor delivery across the blood-brain barrier (BBB) and by the intrinsic resistance of the tumours.
On the other hand, BBB cells and brain tumour cells also carry surface proteins, such as the low-density lipoprotein receptor (LDLR), that could be used as a therapeutic target at the BBB and in brain tumours. The importance of this study therefore lies in generating therapeutic strategies that promote the passage of drugs across the blood-brain and blood-tumour barriers and, at the same time, in identifying the cellular mechanisms that increase the expression of ABC transporters, so that these can be used as therapeutic targets. This study showed that a new "Trojan Horse" strategy, in which the drug DOXOrubicin is enclosed within a liposome, shields the drug so that it is not recognised by the ABC transporters of either the BBB or the tumour cells. The construction of the liposome made it possible to exploit the cells' LDLR receptor, ensuring entry across the BBB and into the tumour cells through endocytosis. This mechanism was combined with the use of statins (anti-cholesterol drugs), which increased LDLR expression and decreased the activity of the ABC transporters by nitrating them, thereby increasing the efficiency of our Trojan Horse. We thus demonstrated that the use of a new strategy or formulation, called ApolipoDOXO, together with statins, improves drug delivery across the BBB, overcoming tumour resistance and reducing the dose-dependent side effects of DOXOrubicin. Moreover, this "Trojan Horse" strategy is a new therapeutic approach that can be regarded as a way to increase the efficacy of different drugs in several brain tumours, and it guarantees high efficiency even in a hypoxic environment, characteristic of cancer cells, in which the expression of the Pgp transporter was found to be increased. Taking into account the relationship between certain signalling pathways recognised as modulators of Pgp activity, this study presents not only the Trojan Horse strategy but also another therapeutic proposal based on the use of Temozolomide plus DOXOrubicin. This strategy showed that temozolomide is able to penetrate the BBB because it interferes with the Wnt/GSK3/β-catenin signalling pathway, which modulates the expression of the Pgp transporter. TMZ was shown to decrease Wnt3 protein and mRNA, supporting the hypothesis that, by reducing transcription of the Wnt3 gene in BBB cells, the drug increases the activation of the pathway, phosphorylating β-catenin and thereby lowering nuclear β-catenin and its binding to the promoter of the mdr1 gene. Based on these results, this study identified three basic mechanisms related to the expression of ABC transporters and associated with the strategies employed: the first was the use of statins, which led to nitration of the transporters, decreasing their activity via the NFκB transcription factor pathway; the second arose from the use of temozolomide, which methylates the Wnt3 gene, reducing the activity of the β-catenin signalling pathway and decreasing the expression of the Pgp transporter; and the third consisted of establishing the relationship between the RhoA/RhoA kinase axis and its role as a modulator of the (non-canonical) GSK3/β-catenin pathway.
It was shown that the RhoA kinase promoted the activation of the protein PTB1, which, by phosphorylating GSK3, induced the phosphorylation of β-catenin, leading to its destruction by the proteasome and preventing its binding to the promoter of the mdr1 gene, thereby reducing its expression. In conclusion, the strategies proposed in this work increased the cytotoxicity towards tumour cells by increasing the permeability not only of the blood-brain barrier but also of the tumour barrier itself. Likewise, the "Trojan Horse" strategy could be useful for the therapy of other diseases of the central nervous system. Furthermore, these studies indicate that identifying the mechanisms associated with the expression of ABC transporters could be a key tool in the development of new anticancer therapies.

Relevance: 20.00%

Publisher:

Abstract:

Civilians constitute a large share of casualties in civil wars across the world. They are targeted to create fear and to punish allegiance with the enemy; this maximizes collaboration with the perpetrator and strengthens the support network necessary to consolidate control over contested regions. I develop a model of the magnitude and structure of civilian killings in civil wars involving two armed groups who fight over territorial control. Armies secure compliance through a combination of carrots and sticks. In turn, civilians differ from each other in their intrinsic preference towards one group. I explore the effect of the empowerment of one of the groups on the civilian death toll. There are two effects that go in opposite directions: while a direct effect makes the powerful group more lethal, there is an indirect effect by which the number of civilians who align with that group increases, leaving fewer enemy supporters to kill. I study the conditions under which one effect dominates and illustrate the predictions using sub-national longitudinal data for Colombia's civil war.
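
Schematically, and in notation that is illustrative rather than the paper's, the comparative static described above can be written as a direct plus an indirect term:

% Let C(\theta, s(\theta)) be the group's civilian killings, \theta its military
% power, and s(\theta) the share of civilians aligned with it (all hypothetical
% symbols chosen for this sketch).
\frac{dC}{d\theta}
  = \underbrace{\frac{\partial C}{\partial \theta}}_{\text{direct: more lethal}}
  + \underbrace{\frac{\partial C}{\partial s}\,\frac{ds}{d\theta}}_{\text{indirect: fewer enemy supporters}} .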

Relevance: 20.00%

Publisher:

Abstract:

The contributions of the correlated and uncorrelated components of the electron-pair density to atomic and molecular intracule I(r) and extracule E(R) densities and their Laplacian functions ∇²I(r) and ∇²E(R) are analyzed at the Hartree-Fock (HF) and configuration interaction (CI) levels of theory. The topologies of the uncorrelated components of these functions can be rationalized in terms of the corresponding one-electron densities. In contrast, by analyzing the correlated components of I(r) and E(R), namely I_C(r) and E_C(R), the effect of electron Fermi and Coulomb correlation can be assessed at the HF and CI levels of theory. Moreover, the contribution of Coulomb correlation can be isolated by means of difference maps between I_C(r) and E_C(R) distributions calculated at the two levels of theory. As application examples, the He, Ne, and Ar atomic series, the isoelectronic C2^2-, N2, and O2^2+ molecular series, and the C2H4 molecule have been investigated. For these atoms and molecules, it is found that Fermi correlation accounts for the main characteristics of I_C(r) and E_C(R), with Coulomb correlation slightly increasing the locality of these functions at the CI level of theory. Furthermore, I_C(r), E_C(R), and the associated Laplacian functions reveal the short-ranged nature and high isotropy of Fermi and Coulomb correlation in atoms and molecules.
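
For reference, the intracule and extracule densities are commonly defined from the spinless electron-pair density Γ(r1, r2) as written below; the notation is the standard one rather than a quotation from the paper.

% Standard definitions of the intracule and extracule densities in terms of the
% spinless electron-pair density \Gamma(\mathbf{r}_1,\mathbf{r}_2).
I(\mathbf{r}) = \int \Gamma(\mathbf{r}_1,\mathbf{r}_2)\,
  \delta\!\big(\mathbf{r} - (\mathbf{r}_1 - \mathbf{r}_2)\big)\, d\mathbf{r}_1\, d\mathbf{r}_2,
\qquad
E(\mathbf{R}) = \int \Gamma(\mathbf{r}_1,\mathbf{r}_2)\,
  \delta\!\Big(\mathbf{R} - \tfrac{\mathbf{r}_1 + \mathbf{r}_2}{2}\Big)\, d\mathbf{r}_1\, d\mathbf{r}_2 .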

Relevance: 20.00%

Publisher:

Abstract:

A topological analysis of intracule and extracule densities and their Laplacians, computed within the Hartree-Fock approximation, is presented. The analysis of the density distributions reveals that, among all possible electron-electron interactions in atoms and between atoms in molecules, only very few are rigorously located as local maxima. In contrast, they are clearly identified as local minima in the topology of the Laplacian maps. The conceptually different interpretations of intracule and extracule maps are also discussed in detail. An application example to the C2H2, C2H4, and C2H6 series of molecules is presented.
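
As a purely illustrative sketch of the kind of topological search described above, the snippet below locates local minima of a Laplacian map on a grid; the map is synthetic, whereas a real analysis would use intracule or extracule values computed from the wavefunction.

# Locate candidate electron-electron interaction signatures as local minima of a
# Laplacian map on a grid (synthetic toy map, for illustration only).
import numpy as np
from scipy.ndimage import minimum_filter, laplace

x, y = np.meshgrid(np.linspace(-3, 3, 200), np.linspace(-3, 3, 200))
density = np.exp(-((x - 1) ** 2 + y ** 2)) + np.exp(-((x + 1) ** 2 + y ** 2))  # toy I(r)
lap = laplace(density)                                      # discrete Laplacian map

is_min = (lap == minimum_filter(lap, size=5)) & (lap < 0)   # local minima with negative value
print(np.argwhere(is_min)[:5])                              # grid indices of candidate minima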