64 results for metadata schemes


Relevance:

10.00%

Publisher:

Abstract:

Breast cancer is the most common cancer in women and accounts for nearly 30% of all new cancer cases in Europe. The number of deaths from breast cancer in Europe is estimated at over 130,000 each year, figures that underline the considerable social impact of the disease. The goals of this thesis were, first, to identify the biological features and mechanisms responsible for the establishment of specific breast cancer subtypes; second, to validate them in a human-in-mouse in vivo model; and third, to develop specific treatments for the identified breast cancer subtypes.

The first objective was achieved through the analysis of tumour gene expression data produced in our laboratory. The microarray data were generated from 49 breast tumour biopsies collected from patients enrolled in the clinical trial EORTC 10994/BIG00-01. The data set was rich in information and allowed me to validate the findings of previous breast cancer gene expression studies and to identify the biological features of a novel breast cancer subtype. In the first part of the thesis I focus on the identification of molecular apocrine breast tumours by microarray analysis and the potential implications of this finding for the clinic.

The second objective was attained by establishing a human breast cancer model system based on primary human mammary epithelial cells (HMECs) derived from reduction mammoplasties. I chose to adapt a previously described mammosphere-based suspension culture system and expressed selected target genes using lentiviral expression constructs. In the second part of my thesis I focus on the establishment of a cell culture system allowing quantitative transformation of HMECs. I then established a xenograft model in immunodeficient NOD/SCID mice, which allows the human disease to be modelled in the mouse.

In the third part of my thesis I describe and discuss the results obtained while establishing an oestrogen-dependent model of breast cancer by quantitative transformation of HMECs with defined genes identified through breast cancer gene expression data analysis. The transformed cells in our model are oestrogen-dependent for growth and remain diploid and genetically normal even after prolonged cell culture in vitro. The cells form tumours and disseminated peritoneal and liver metastases in our xenograft model. In line with the third objective of my thesis, I defined and tested treatment schemes that reduce tumours and metastases. I have generated a genetically defined model of oestrogen receptor alpha-positive human breast cancer that models human oestrogen-dependent breast cancer in the mouse and enables the study of the mechanisms involved in tumorigenesis and metastasis.

Relevance:

10.00%

Publisher:

Abstract:

For a wide range of environmental, hydrological, and engineering applications there is a fast-growing need for high-resolution imaging. In this context, waveform tomographic imaging of crosshole georadar data is a powerful method able to provide images of pertinent electrical properties in near-surface environments with unprecedented spatial resolution. In contrast, conventional ray-based tomographic methods, which consider only a very limited part of the recorded signal (first-arrival traveltimes and maximum first-cycle amplitudes), suffer from inherent limitations in resolution and may prove inadequate in complex environments. For a typical crosshole georadar survey, the potential improvement in resolution when using waveform-based rather than ray-based approaches is on the order of one order of magnitude. Moreover, the spatial resolution of waveform-based inversions is comparable to that of common logging methods. While waveform tomographic imaging has become well established in exploration seismology over the past two decades, it remains comparatively underdeveloped in the georadar domain despite corresponding needs. Recently, different groups have presented finite-difference time-domain waveform inversion schemes for crosshole georadar data, which are adaptations and extensions of Tarantola's seminal nonlinear generalized least-squares approach developed for the seismic case. First applications of these new crosshole georadar waveform inversion schemes to synthetic and field data have shown promising results. However, little is known about the limits and performance of such schemes in complex environments. The general motivation of my thesis is therefore to evaluate the robustness and limitations of waveform inversion algorithms for crosshole georadar data in order to apply such schemes to a wide range of real-world problems.

One crucial issue in making any waveform scheme applicable and effective for real-world crosshole georadar problems is the accurate estimation of the source wavelet, which is unknown in reality. Waveform inversion schemes for crosshole georadar data require forward simulations of the wavefield in order to iteratively solve the inverse problem, so accurate knowledge of the source wavelet is critically important for their successful application. Relatively small differences in the estimated source wavelet shape can lead to large differences in the resulting tomograms. In the first part of my thesis, I explore the viability and robustness of a relatively simple iterative deconvolution technique that incorporates the estimation of the source wavelet into the waveform inversion procedure rather than adding additional model parameters to the inversion problem. Extensive tests indicate that this source wavelet estimation technique is simple yet effective, and is able to provide remarkably accurate and robust estimates of the source wavelet in the presence of strong heterogeneity in both the dielectric permittivity and the electrical conductivity, as well as significant ambient noise in the recorded data. Furthermore, our tests also indicate that the approach is insensitive to the phase characteristics of the starting wavelet, which is not the case when the wavelet estimation is incorporated directly into the inverse problem.

Another critical issue with crosshole georadar waveform inversion schemes that clearly needs to be investigated is the consequence of the common assumption of frequency-independent electromagnetic constitutive parameters. This is crucial because, in reality, these parameters are known to be frequency-dependent and complex, and recorded georadar data may therefore show significant dispersive behaviour. In particular, in the presence of water there is a wide body of evidence showing that the dielectric permittivity can be significantly frequency-dependent over the GPR frequency range, owing to a variety of relaxation processes. The second part of my thesis is therefore dedicated to evaluating the reconstruction limits of a non-dispersive crosshole georadar waveform inversion scheme in the presence of varying degrees of dielectric dispersion. I show that the inversion algorithm, combined with the iterative deconvolution-based source wavelet estimation procedure, which is partially able to account for frequency-dependent effects through an "effective" wavelet, performs remarkably well in weakly to moderately dispersive environments and is able to provide adequate tomographic reconstructions.
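As a rough illustration of the deconvolution-based wavelet estimation described above, the sketch below shows one common way to obtain a least-squares source-wavelet estimate by deconvolving observed traces with synthetics computed for the current model and averaging over all source-receiver pairs. It is a minimal sketch under stated assumptions (water-level stabilisation, trace-wise averaging, NumPy arrays of shape (n_traces, n_samples)); the function name and parameters are illustrative and do not reproduce the thesis implementation.

```python
import numpy as np

def estimate_wavelet(observed, synthetic, water_level=1e-3):
    """Frequency-domain source-wavelet estimate obtained by deconvolving
    observed traces with synthetics computed for the current model and
    averaging the per-trace transfer functions.

    observed, synthetic : arrays of shape (n_traces, n_samples)
    water_level         : fraction of the peak spectral power used to
                          stabilise the spectral division
    """
    n = observed.shape[-1]
    obs_spec = np.fft.rfft(observed, axis=-1)
    syn_spec = np.fft.rfft(synthetic, axis=-1)
    denom = np.abs(syn_spec) ** 2
    denom = np.maximum(denom, water_level * denom.max())
    # least-squares transfer function per frequency, averaged over traces
    wavelet_spec = np.mean(obs_spec * np.conj(syn_spec) / denom, axis=0)
    return np.fft.irfft(wavelet_spec, n=n)
```

In an iterative setting, an estimate of this kind would be refreshed after each model update, so that the wavelet and the subsurface model are improved alternately.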

Relevance:

10.00%

Publisher:

Abstract:

PURPOSE: Most existing methods for accelerated parallel imaging in MRI require additional data, which are used to derive information about the sensitivity profile of each radiofrequency (RF) channel. In this work, a method is presented to avoid the acquisition of separate coil calibration data for accelerated Cartesian trajectories. METHODS: Quadratic phase is imparted to the image to spread the signals in k-space (also known as phase scrambling). By rewriting the Fourier transform as a convolution operation, a window can be introduced to the convolved chirp function, allowing a low-resolution image to be reconstructed from phase-scrambled data without prominent aliasing. This image (for each RF channel) can be used to derive coil sensitivities to drive existing parallel imaging techniques. As a proof of concept, the quadratic phase was applied by introducing an offset to the x² − y² shim, and the data were reconstructed using adapted versions of the image-space-based sensitivity encoding (SENSE) and GeneRalized Autocalibrating Partially Parallel Acquisitions (GRAPPA) algorithms. RESULTS: The method is demonstrated in a phantom (1 × 2, 1 × 3, and 2 × 2 acceleration) and in vivo (2 × 2 acceleration) using a 3D gradient echo acquisition. CONCLUSION: Phase scrambling can be used to perform parallel imaging acceleration without acquisition of separate coil calibration data, as demonstrated here for a 3D Cartesian trajectory. Further research is required to prove the applicability to other 2D and 3D sampling schemes. Magn Reson Med, 2014. © 2014 Wiley Periodicals, Inc.
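A schematic one-dimensional statement of the convolution picture used above, with normalisation constants and the 2D/3D generalisation omitted (this is an illustrative identity, not the paper's derivation): applying a quadratic "scrambling" phase to the object ρ(x) and completing the square in the Fourier integral gives

\[
s(k) \;=\; \int \rho(x)\, e^{i\alpha x^{2}}\, e^{-ikx}\, dx
\;=\; e^{-i k^{2}/(4\alpha)}\,\bigl(\rho * c\bigr)\!\left(\tfrac{k}{2\alpha}\right),
\qquad c(u) = e^{i\alpha u^{2}} .
\]

Up to a known quadratic phase factor, the phase-scrambled k-space data are therefore the object convolved with a chirp and sampled at positions k/(2α). Introducing a window on the chirp c in this relation limits how far signal is spread, which is why a low-resolution, essentially alias-free image can be reconstructed for each receive channel and used to derive the coil sensitivities.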

Relevance:

10.00%

Publisher:

Abstract:

This Practical Note examines the nascent micro-insurance sector in West Bengal, paying particular attention to the corporate-NGO partnership model for micro-insurance distribution, which has been enabled by India's unique regulatory framework. We challenge the popular construction of this model as a 'win-win' for all parties by analysing conflicting understandings of micro-insurance schemes and their purposes by insurance companies, NGOs, and poor villagers. The article also considers the role of the specific political context of West Bengal in constricting corporate-NGO micro-insurance

Relevance:

10.00%

Publisher:

Abstract:

Understanding how communities of living organisms assemble has been a central question in ecology since the early days of the discipline. Disentangling the different processes involved in community assembly is not only interesting in itself but also crucial for an understanding of how communities will behave under future environmental scenarios. The traditional concept of assembly rules reflects the notion that species do not co-occur randomly but are restricted in their co-occurrence by interspecific competition. This concept can be redefined in a more general framework where the co-occurrence of species is a product of chance, historical patterns of speciation and migration, dispersal, abiotic environmental factors, and biotic interactions, with none of these processes being mutually exclusive. Here we present a survey and meta-analyses of 59 papers that compare observed patterns in plant communities with null models simulating random patterns of species assembly. According to the type of data under study and the different methods that are applied to detect community assembly, we distinguish four main types of approach in the published literature: species co-occurrence, niche limitation, guild proportionality and limiting similarity. Results from our meta-analyses suggest that non-random co-occurrence of plant species is not a widespread phenomenon. However, whether this finding reflects the individualistic nature of plant communities or is caused by methodological shortcomings associated with the studies considered cannot be discerned from the available metadata. We advocate that more thorough surveys be conducted using a set of standardized methods to test for the existence of assembly rules in data sets spanning larger biological and geographical scales than have been considered until now. We underpin this general advice with guidelines that should be considered in future assembly rules research. This will enable us to draw more accurate and general conclusions about the non-random aspect of assembly in plant communities.
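To make the class of tests surveyed above concrete, here is a minimal sketch of one simple co-occurrence null model: the checkerboard-based C-score of a presence/absence matrix compared against randomisations that preserve each species' frequency (row sums) but not site richness. It is illustrative only; the papers covered by the meta-analysis use a variety of statistics (species co-occurrence, niche limitation, guild proportionality, limiting similarity) and randomisation constraints, and the function names here are hypothetical.

```python
import numpy as np

def c_score(m):
    """Mean number of 'checkerboard units' over all species pairs of a
    0/1 presence/absence matrix m (rows = species, columns = sites)."""
    n_species = m.shape[0]
    scores = []
    for i in range(n_species):
        for j in range(i + 1, n_species):
            shared = np.sum(m[i] & m[j])
            scores.append((m[i].sum() - shared) * (m[j].sum() - shared))
    return np.mean(scores)

def null_model_test(m, n_null=999, seed=None):
    """Compare the observed C-score with a null distribution obtained by
    shuffling each species' occurrences across sites (row sums preserved)."""
    rng = np.random.default_rng(seed)
    observed = c_score(m)
    null = np.empty(n_null)
    for k in range(n_null):
        shuffled = np.array([rng.permutation(row) for row in m])
        null[k] = c_score(shuffled)
    p_value = (np.sum(null >= observed) + 1) / (n_null + 1)
    return observed, null.mean(), p_value
```

An observed C-score far above the null distribution would indicate segregated (non-random) co-occurrence; the meta-analysis above suggests that such departures are not the norm in plant communities.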

Relevance:

10.00%

Publisher:

Abstract:

This Ph.D. dissertation studies the work motivation of employees in the delivery of public services. The question of work motivation in public services is not new, but it has become central for governments now facing unprecedented public debts. The objective of this research is twofold. First, we want to see whether the work motivation of employees in public services is a continuum (intrinsic and extrinsic motivations cannot coexist) or a bi-dimensional construct (intrinsic and extrinsic motivations coexist simultaneously). Research in the public administration literature has focused on the concept of public service motivation and considered motivation to be uni-dimensional (Perry and Hondeghem 2008). However, no study has yet tackled both types of motivation, intrinsic and extrinsic, at the same time. This dissertation proposes, in Part I, a theoretical assessment and an empirical test of a global work motivational structure, using a self-constructed Swiss dataset with employees from three public services: the education sector, the security sector and the public administrative services sector. Our findings suggest that work motivation in public services is not uni-dimensional but bi-dimensional: intrinsic and extrinsic motivations coexist simultaneously and can be positively correlated (Amabile et al. 1994). Our findings show that intrinsic motivation is as important as extrinsic motivation; thus, the assumption that employees in public services are less attracted by extrinsic rewards is not confirmed for this sample. Another important finding concerns the public service motivation concept, which, as theoretically predicted, represents the major motivational dimension of employees in the delivery of public services. Second, the theory of public service motivation assumes that employees in public services engage in activities that go beyond their self-interest, but it never uses this construct as a determinant of their pro-social behavior. At the same time, several studies (Gregg et al. 2011; Georgellis et al. 2011) provide evidence on the pro-social behavior of employees in public services. However, they do not identify which type of motivation is at the origin of this behavior; they only assume an intrinsically motivated behavior. We analyze the pro-social behavior of employees in public services and use public service motivation as a determinant of their pro-social behavior. We add other determinants highlighted by the theory of pro-social behavior (Bénabou and Tirole 2006), by Le Grand (2003) and by fit theories (Besley and Ghatak 2005). We test these determinants in Part II and identify, for each sector of activity, their positive or negative impact on the pro-social behavior of Swiss employees. Contrary to expectations, we find, for this sample, that both intrinsic and extrinsic factors have a positive impact on pro-social behavior; no crowding-out effect is identified. We confirm Le Grand's (2003) hypothesis about the positive impact of the opportunity cost on pro-social behavior. Our results suggest a mix of action-oriented altruism and output-oriented altruism among employees in public services. These results are relevant when designing incentive schemes for employees in the delivery of public services.

Relevance:

10.00%

Publisher:

Abstract:

Understanding tree recruitment is needed to forecast future forest distribution. Many studies have reported the ecological factors relevant to recruitment success in trees, but the potential for genetic-based differences in recruitment has often been neglected. In this study, we established a semi-natural reciprocal sowing experiment to test for local adaptation and microenvironment effects (evaluated here by canopy cover) on the emergence and early survival of maritime pine (Pinus pinaster Aiton), an emblematic Mediterranean forest tree. A novel application of molecular markers was also developed to test for family selection and, thus, for potential genetic change over generations. Overall, we did not find evidence to support local adaptation at the recruitment stage in our semi-natural experiment. Moreover, only weak family selection (if any) was found, suggesting that in stressful environments with low survival, stochastic processes and among-year climate variability may drive recruitment. Nevertheless, our study revealed that, at early stages of recruitment, microenvironments may favor the population with the best-adapted life strategy, irrespective of its (local or non-local) origin. We also found that emergence time is a key factor for seedling survival in stressful Mediterranean environments. Our study highlights the complexity of the factors influencing the early stages of establishment of maritime pine and provides insights into possible management actions aimed at mitigating the impact of environmental change. In particular, we found that the high stochasticity of the recruitment process in stressful environments and the differences in population-specific adaptive strategies may complicate assisted migration schemes.

Relevance:

10.00%

Publisher:

Abstract:

Despite the fact that there are more than twenty thousand biomedical journals in the world, research into the work of editors and the publication process in biomedical and health care journals is rare. In December 2012, the Esteve Foundation, a non-profit scientific institution that fosters progress in pharmacotherapy by means of scientific communication and discussion, organized a discussion group of 7 editors and/or experts in peer review and biomedical publishing. They presented findings of past editorial research, discussed the lack of competitive funding schemes and specialized journals for the dissemination of editorial research, and reported on the great diversity of misconduct and conflict-of-interest policies, as well as adherence to reporting guidelines. Furthermore, they reported on the reluctance of editors to investigate allegations of misconduct or to increase the level of data sharing in health research. In the end, they concluded that if editors are to remain gatekeepers of scientific knowledge, they should reaffirm their focus on the integrity of the scientific record and the completeness of the data they publish. Additionally, more research should be undertaken to understand why many journals are not adhering to editorial standards, and what obstacles editors face when engaging in editorial research.

Relevance:

10.00%

Publisher:

Abstract:

Payments for Environmental Services (PES) are praised as innovative policy instruments, and they influence the governance of forest restoration efforts in two major ways. The first is the establishment of multi-stakeholder agencies as intermediary bodies between funders and planters, to manage the funds and to distribute incentives to planters. The second implication is that specific contracts assign objectives to land users in the form of conditions for payments, which are believed to increase the chances of sustained impacts on the ground. These implications are important in assessing the potential of PES to operate as new and effective funding schemes for forest restoration. They are analyzed by looking at two prominent payments-for-watershed-services programs in Indonesia, in Cidanau (Banten province, Java) and West Lombok (Eastern Indonesia), with combined economic and political science approaches. We derive lessons for the governance of funding efforts (e.g., multi-stakeholder agencies are no guarantee of success; mixed results are obtained from a reliance on mandatory funding with ad hoc regulations, as opposed to voluntary contributions by the service beneficiary) and for the governance of financial expenditure (e.g., the absolute need for evaluation procedures in the internal governance of farmer groups). Furthermore, we observe that these governance features provide no guarantee that the restoration plots with the highest relevance for ecosystem services are targeted by the PES.

Relevance:

10.00%

Publisher:

Abstract:

SUMMARY: ExpressionView is an R package that provides an interactive graphical environment to explore transcription modules identified in gene expression data. A sophisticated ordering algorithm is used to present the modules, together with the expression data, in a visually appealing layout that provides an intuitive summary of the results. From this overview, the user can select individual modules and access biologically relevant metadata associated with them. AVAILABILITY: http://www.unil.ch/cbg/ExpressionView. Screenshots, tutorials and sample data sets can be found on the ExpressionView web site.

Relevance:

10.00%

Publisher:

Abstract:

Accurate modeling of flow instabilities requires computational tools able to deal with several interacting scales, from the scale at which fingers are triggered up to the scale at which their effects need to be described. The Multiscale Finite Volume (MsFV) method offers a framework to couple fine- and coarse-scale features by solving a set of localized problems, which are used both to define a coarse-scale problem and to reconstruct the fine-scale details of the flow. The MsFV method can be seen as an upscaling-downscaling technique, which is computationally more efficient than standard discretization schemes and more accurate than traditional upscaling techniques. We show that, although the method has proven accurate in modeling density-driven flow under stable conditions, the accuracy of the MsFV method deteriorates in the case of unstable flow, and an iterative scheme is required to control the localization error. To avoid a large computational overhead due to the iterative scheme, we suggest several adaptive strategies for both flow and transport. In particular, the concentration gradient is used to identify a front region where instabilities are triggered and an accurate (iteratively improved) solution is required. Outside the front region the problem is upscaled and both flow and transport are solved only at the coarse scale. This adaptive strategy leads to very accurate solutions at roughly the same computational cost as the non-iterative MsFV method. In many circumstances, however, an accurate description of flow instabilities requires a refinement of the computational grid rather than a coarsening. For these problems, we propose a modified iterative MsFV that can be used as a downscaling method (DMsFV). Compared with other grid refinement techniques, the DMsFV clearly separates the computational domain into refined and non-refined regions, which can be treated separately and matched later. This gives great flexibility to employ different physical descriptions in different regions, where different equations could be solved, offering an excellent framework to construct hybrid methods.
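As an illustration of the adaptive criterion described above, the sketch below flags the "front region" where the concentration gradient is large and where an iteratively improved (fine-scale) solution would be computed, while the remaining cells are kept at the coarse scale. It assumes a structured 2-D grid and a user-chosen threshold; the MsFV localisation and iteration machinery itself is not reproduced here, and the function name is illustrative.

```python
import numpy as np

def flag_front_cells(concentration, threshold):
    """Return a boolean mask of cells whose concentration-gradient magnitude
    exceeds 'threshold'. These cells form the front region that would be
    solved with the iteratively improved fine-scale scheme; elsewhere only
    the coarse-scale problem would be solved.

    concentration : 2-D array of cell-centred concentration values
    threshold     : gradient magnitude above which a cell is flagged
    """
    grad_y, grad_x = np.gradient(concentration)   # finite-difference gradients
    grad_magnitude = np.hypot(grad_x, grad_y)
    return grad_magnitude > threshold
```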

Relevance:

10.00%

Publisher:

Abstract:

Background: To compare the different schemes that have been proposed over the past decades to explain the renewal of the corneal epithelium. Material and Methods: We analyzed the data available in the literature on the renewal of the corneal epithelium in mammals. Based on the schemes proposed in the literature, we developed a 3D animation to facilitate understanding of the different concepts. Results: Three different schemes have been proposed to explain the renewal of the corneal epithelium in mammals. 1950-1981: the corneal epithelium was thought to be renewed by mitosis of cells located in the basal layer; at that time scientists were not yet talking about stem cells. 1981-1986 was the period of the "XYZ hypothesis", or transdifferentiation paradigm, in which the conjunctival epithelium was thought to renew the corneal epithelium through centripetal migration. 1986-2008: the limbal stem cell paradigm, in which there were no stem cells in the corneal epithelium; all corneal stem cells were located in the limbus and renewed the central cornea after a 6-7 mm migration of transient amplifying cells toward the centre of the cornea. In 2008, epithelial stem cells were found in the central cornea of mammals (Majo et al., Nature, November 2008). Discussion: We thought that the renewal of the corneal epithelium was completely defined. According to the results we published in Nature, the current paradigm will have to be revisited. Our experiments were performed in animals, and the final demonstration in humans has still to be done. If we find the same results in humans, a new paradigm will be defined and will change the way we consider ocular surface therapy and reconstruction.

Relevance:

10.00%

Publisher:

Abstract:

One major methodological problem in the analysis of sequence data is the determination of the costs from which distances between sequences are derived. Although this problem is currently not optimally dealt with in the social sciences, it has some similarity with problems that have been solved in bioinformatics for three decades. In this article, the authors propose an optimization of substitution and deletion/insertion costs based on computational methods. The authors provide an empirical way of determining costs for cases, frequent in the social sciences, in which theory does not clearly promote one cost scheme over another. Using three distinct data sets, the authors tested the distances and cluster solutions produced by the new cost scheme in comparison with solutions based on cost schemes associated with other research strategies. The proposed method performs well compared with other cost-setting strategies, while it alleviates the justification problem of cost schemes.
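For reference, the sketch below shows how a given cost scheme is turned into a distance between two sequences by classical optimal matching (dynamic programming), which is the quantity that any cost-setting strategy ultimately feeds into. It assumes integer-coded states and a symmetric substitution-cost matrix, and does not reproduce the authors' cost-optimization procedure.

```python
import numpy as np

def om_distance(seq_a, seq_b, sub_cost, indel):
    """Optimal-matching (edit) distance between two state sequences, given a
    substitution-cost matrix 'sub_cost' (indexed by integer-coded states) and
    a single insertion/deletion cost 'indel'. Standard dynamic programming."""
    n, m = len(seq_a), len(seq_b)
    d = np.zeros((n + 1, m + 1))
    d[:, 0] = np.arange(n + 1) * indel
    d[0, :] = np.arange(m + 1) * indel
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d[i, j] = min(
                d[i - 1, j] + indel,                                      # deletion
                d[i, j - 1] + indel,                                      # insertion
                d[i - 1, j - 1] + sub_cost[seq_a[i - 1], seq_b[j - 1]],   # substitution
            )
    return d[n, m]
```

For example, with sub_cost = np.array([[0, 1, 2], [1, 0, 1], [2, 1, 0]]) and indel=1, om_distance([0, 1, 1, 2], [0, 2, 2], sub_cost, 1) returns 2.0 (one substitution of cost 1 plus one indel). Pairwise distances computed this way are then typically fed into a clustering procedure, which is why the choice of the cost scheme matters.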

Relevance:

10.00%

Publisher:

Abstract:

PURPOSE: To compare 3 different flow-targeted magnetization preparation strategies for coronary MR angiography (cMRA) that allow selective visualization of the vessel lumen. MATERIAL AND METHODS: The right coronary artery of 10 healthy subjects was investigated on a 1.5 Tesla MR system (Gyroscan ACS-NT, Philips Healthcare, Best, NL). A navigator-gated and ECG-triggered 3D radial steady-state free-precession (SSFP) cMRA sequence was performed with 3 different magnetization preparation schemes, referred to as projection SSFP (selective labeling of the aorta, subtraction of 2 data sets), LoReIn SSFP (double-inversion preparation, selective labeling of the aorta, 1 data set), and inflow SSFP (inversion preparation, selective labeling of the coronary artery, 1 data set). The signal-to-noise ratio (SNR) of the coronary artery and aorta, the contrast-to-noise ratio (CNR) between the coronary artery and epicardial fat, vessel length and vessel sharpness were analyzed. RESULTS: All cMRA sequences were successfully obtained in all subjects. Both projection SSFP and LoReIn SSFP allowed selective visualization of the coronary arteries with excellent background suppression. Scan time was doubled in projection SSFP because of the need to subtract 2 data sets. In inflow SSFP, background suppression was limited to the tissue included in the inversion volume. Projection SSFP (SNR(coro): 25.6 ± 12.1; SNR(ao): 26.1 ± 16.8; CNR(coro-fat): 22.0 ± 11.7) and inflow SSFP (SNR(coro): 27.9 ± 5.4; SNR(ao): 37.4 ± 9.2; CNR(coro-fat): 24.9 ± 4.8) yielded significantly higher SNR and CNR than LoReIn SSFP (SNR(coro): 12.3 ± 5.4; SNR(ao): 11.8 ± 5.8; CNR(coro-fat): 9.8 ± 5.5; P < 0.05 for both). The longest visible vessel length was found with projection SSFP (79.5 ± 18.9 mm; P < 0.05 vs. LoReIn), whereas vessel sharpness was best with inflow SSFP (68.2% ± 4.5%; P < 0.05 vs. LoReIn). Consistently good image quality was achieved using inflow SSFP, likely because of the simple planning procedure and short scanning time. CONCLUSION: Three flow-targeted cMRA approaches are presented, which provide selective visualization of the coronary vessel lumen and, in addition, blood flow information without the need for contrast agent administration. Inflow SSFP yielded the highest SNR, CNR and vessel sharpness and may prove useful as a fast and efficient approach for assessing proximal and mid-vessel coronary blood flow, while requiring less planning skill than projection SSFP or LoReIn SSFP.
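For reference, the SNR and CNR figures quoted above are conventionally computed from region-of-interest (ROI) measurements. Assuming the usual definitions (the exact ROI placement used in the study is not reproduced here):

\[
\mathrm{SNR}_{\text{coro}} = \frac{\bar{S}_{\text{coro}}}{\sigma_{\text{noise}}},
\qquad
\mathrm{CNR}_{\text{coro-fat}} = \frac{\bar{S}_{\text{coro}} - \bar{S}_{\text{fat}}}{\sigma_{\text{noise}}},
\]

where \(\bar{S}\) is the mean signal in the respective region of interest and \(\sigma_{\text{noise}}\) is the standard deviation of the background noise.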