24 results for Eigenfunctions and fundamental solution
Abstract:
Nowadays, the joint exploitation of images acquired daily by remote sensing instruments and of images available from archives allows detailed monitoring of the transitions occurring at the surface of the Earth. These modifications of the land cover generate spectral discrepancies that can be detected via the analysis of remote sensing images. Regardless of the origin of the images and of the type of surface change, correct processing of such data requires the adoption of flexible, robust and possibly nonlinear methods, to correctly account for the complex statistical relationships characterizing the pixels of the images. This thesis deals with the development and application of advanced statistical methods for multi-temporal optical remote sensing image processing tasks. Three different families of machine learning models have been explored, and fundamental solutions for change detection problems are provided. In the first part, change detection with user supervision is considered. In a first application, a nonlinear classifier is applied with the intent of precisely delineating flooded regions from a pair of images. In a second case study, the spatial context of each pixel is injected into another nonlinear classifier to obtain a precise mapping of new urban structures. In both cases, the user provides the classifier with examples of what they believe has or has not changed. In the second part, a completely automatic and unsupervised method for precise binary detection of changes is proposed. The technique allows very accurate mapping without any user intervention, which is particularly useful when the readiness and reaction time of the system are a crucial constraint. In the third part, the problem of statistical distributions shifting between acquisitions is studied. Two approaches that transform the pair of bi-temporal images to reduce their differences unrelated to changes in land cover are studied. The methods align the distributions of the images, so that the pixel-wise comparison can be carried out with higher accuracy. Furthermore, the second method can deal with images from different sensors, regardless of the dimensionality of the data or the spectral information content. This opens the door to possible solutions for a crucial problem in the field: detecting changes when the images have been acquired by two different sensors.
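The abstract does not detail the unsupervised detector it proposes; as a minimal sketch of the kind of pixel-wise binary change detection it refers to, a common baseline on two co-registered multispectral acquisitions is change vector analysis with an automatic threshold. The function below is purely illustrative, not the thesis method:

```python
import numpy as np
from skimage.filters import threshold_otsu

def change_map(img_t1, img_t2):
    """Binary change detection on two co-registered multispectral
    images of shape (H, W, bands), via change vector analysis.

    Generic baseline for illustration, not the method of the thesis.
    """
    # Spectral change vector at each pixel, and its magnitude
    diff = img_t2.astype(float) - img_t1.astype(float)
    magnitude = np.sqrt((diff ** 2).sum(axis=-1))
    # Automatic (Otsu) threshold separating "change" from "no change"
    t = threshold_otsu(magnitude)
    return magnitude > t
```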
Abstract:
Dendritic cells (DCs) are leukocytes specialised in the uptake, processing, and presentation of antigen, and are fundamental in regulating both innate and adaptive immune functions. They are mainly localised at the interface between body surfaces and the environment, continuously scrutinising incoming antigen for the potential threat it may represent to the organism. In the respiratory tract, DCs constitute a tightly enmeshed network, with the most prominent populations localised in the epithelium of the conducting airways and lung parenchyma. Their unique localisation enables them to continuously assess inhaled antigen, either inducing tolerance to inoffensive substances or initiating immunity against a potentially harmful pathogen. This immunological homeostasis requires stringent control mechanisms to protect the vital and fragile gaseous exchange barrier from unrestrained and damaging inflammation, or from an exaggerated immune response to an innocuous allergen, as in allergic asthma. During DC activation, co-stimulatory molecules and maturation markers are upregulated, enabling DCs to activate naïve T cells. This activation is accompanied by chemokine and cytokine release that not only serves to amplify the innate immune response, but also determines the type of effector T cell population generated. An increasing body of recent literature provides evidence that different DC subpopulations in the lungs, such as myeloid DCs (mDCs) and plasmacytoid DCs (pDCs), occupy a key position at the crossroads between tolerance and immunity. This review aims to provide the clinician and researcher with a summary of the latest insights into DC-mediated pulmonary immune regulation and its relevance for developing novel therapeutic strategies for various disease conditions such as infection, asthma, COPD, and fibrotic lung disease.
Abstract:
The relationship between the structures of protein-ligand complexes existing in the crystal and in solution, essential in the case of fragment-based screening by X-ray crystallography (FBS-X), has often been an object of controversy. To address this question, simultaneous co-crystallization and soaking of two inhibitors at different ratios, Fidarestat (FID; K(d) = 6.5 nM) and IDD594 (594; K(d) = 61 nM), which bind to h-aldose reductase (AR), were performed. The subatomic resolution of the crystal structures allows the differentiation of both inhibitors, even when the structures are almost superposed. We have determined the occupation ratio in solution by mass spectrometry (MS), Occ(FID)/Occ(594) = 2.7, and in the crystal by X-ray crystallography, Occ(FID)/Occ(594) = 0.6. The occupancies in the crystal and in solution thus differ by a factor of 4.6, implying that ligand binding potency is influenced by crystal contacts. A structural analysis shows that Loop A (residues 122-130), which is exposed to the solvent, is flexible in solution and is involved in packing contacts within the crystal. Furthermore, inhibitor 594 contacts the base of Loop A, stabilizing it, while inhibitor FID does not. This is shown by the difference in B-factors of Loop A between the AR-594 and AR-FID complexes. A stable loop diminishes the entropic energy barrier to binding, favoring 594 over FID. Therefore, the effect of the crystal environment should be taken into consideration in X-ray diffraction analyses of ligand binding to proteins. This conclusion highlights the need for additional methodologies to validate FBS-X, a powerful and widely used screening technique.
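The factor quoted above is simply the ratio of the two occupancy ratios; with the rounded values given in the abstract,

$$\frac{\left(\mathrm{Occ_{FID}}/\mathrm{Occ_{594}}\right)_{\text{solution}}}{\left(\mathrm{Occ_{FID}}/\mathrm{Occ_{594}}\right)_{\text{crystal}}} = \frac{2.7}{0.6} \approx 4.5,$$

in line with the factor of 4.6 reported, which presumably derives from the unrounded occupancies.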
Abstract:
Over the last century, numerous techniques have been developed to analyze the movement of humans while walking and running. The combined use of kinematic and kinetic methods, mainly based on high-speed video analysis and force plates, has permitted a comprehensive description of the locomotion process in terms of energetics and biomechanics. While the different phases of a single gait cycle are well understood, there is increasing interest in how the neuro-motor system controls gait from stride to stride. Indeed, it has been observed that neurodegenerative diseases and aging can affect gait stability and the steadiness of gait parameters. From both clinical and fundamental research perspectives, there is therefore a need for techniques that accurately track gait parameters stride by stride over long periods with minimal constraints on patients. In this context, high-accuracy satellite positioning can provide an alternative tool to monitor outdoor walking. Indeed, high-end GPS receivers provide centimeter-accuracy positioning at 5-20 Hz sampling rates: this allows the stride-by-stride assessment of a number of basic gait parameters--such as walking speed, step length and step frequency--that can be tracked over several thousand consecutive strides in free-living conditions. Furthermore, long-range correlations and fractal-like patterns have been observed in these time series. Compared with other classical methods, GPS seems a promising technology in the field of gait variability analysis. However, relatively high complexity and cost--combined with usability that requires further improvement--remain obstacles to the full development of GPS technology in human applications.
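As an illustration of the stride-by-stride parameters mentioned above, the sketch below derives walking speed, step length and step frequency from a timestamped GPS track, assuming step events have already been detected by some external means. All names are hypothetical; this is not the instrumentation protocol of the paper:

```python
import numpy as np

def gait_parameters(t, x, y, step_times):
    """Step-by-step gait parameters from a GPS track.

    t, x, y    : arrays of timestamps (s) and planar positions (m)
                 from a high-end GPS receiver (e.g. 10 Hz sampling)
    step_times : timestamps of detected step events (s); the event
                 detector itself is outside this sketch

    Returns per-step frequency (Hz), length (m) and speed (m/s).
    Generic illustration only.
    """
    # Cumulative distance along the trajectory
    d = np.concatenate(([0.0], np.cumsum(np.hypot(np.diff(x), np.diff(y)))))
    # Distance walked at each step event, by interpolation
    d_steps = np.interp(step_times, t, d)
    step_len = np.diff(d_steps)        # metres per step
    step_dur = np.diff(step_times)     # seconds per step
    step_freq = 1.0 / step_dur         # steps per second
    speed = step_len / step_dur        # metres per second
    return step_freq, step_len, speed
```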
Abstract:
In arson cases, the collection and detection of traces of ignitable liquids on a suspect's hands can provide information to a forensic investigation. Police forces currently lack a simple, robust, efficient and reliable solution to perform this type of swabbing. In this article, we describe a study undertaken to develop a procedure for the collection of ignitable liquid residues on the hands of arson suspects. Sixteen different collection supports were considered, and their applicability to the collection of gasoline traces present on hands and to subsequent laboratory analysis was evaluated. Background contamination, consisting of volatiles emanating from the collection supports, and the collection efficiencies of the different sampling materials were assessed by passive headspace extraction with an activated charcoal strip (DFLEX device) followed by gas chromatography-mass spectrometry (GC-MS) analysis. After statistical treatment of the results, non-powdered latex gloves were selected as the most suitable sampling material. On the basis of these results, a prototype sampling kit was designed and tested. The kit consists of a three-compartment multilayer bag enclosed in a sealed metal can and containing three pairs of non-powdered latex gloves: one to be worn by the sampler, one serving as a blank sample, and the last to be worn by the person suspected of having been in contact with ignitable liquids. The kit was designed to be effective in preventing external contamination and cross-contamination.
Abstract:
PURPOSE OF REVIEW: Intensive insulin therapy titrated to restore and maintain blood glucose between 80 and 110 mg/dl (4.4-6.1 mmol/l) was found to improve the survival of critically ill patients in one pioneering proof-of-concept study performed in a surgical intensive care unit. The external validity of these findings was investigated. RECENT FINDINGS: Six independent prospective randomized controlled trials, involving 9877 patients in total, were unable to confirm the survival benefit reported in the pioneering trial. Several hypotheses have been proposed to explain this discrepancy, including differences in case mix, in the features of usual care, in the quality of glucose control, and in the risks associated with hypoglycemia. SUMMARY: Until the conditions under which tight glycemic control improves outcome are better understood and delineated, the choice of an intermediate glycemic target appears to be a safe and effective solution.
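The two unit systems quoted for the glycemic target are related by the molar mass of glucose (approximately 180.16 g/mol): multiplying mg/dl by 10 converts to mg/l, and dividing by the molar mass yields mmol/l,

$$\frac{80 \times 10}{180.16} \approx 4.4\ \mathrm{mmol/l}, \qquad \frac{110 \times 10}{180.16} \approx 6.1\ \mathrm{mmol/l},$$

recovering the 4.4-6.1 mmol/l range given above.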
Abstract:
Western governments have spent substantial amounts of money to facilitate the integration of information and communication technologies (ICT) into education, hoping to find an economical solution to the thorny equation that can be summarized by the famous formula "do more and better with less". Despite these efforts, and notwithstanding the real improvements due to the undeniable betterment of the infrastructure and of the quality of service, this goal is far from being reached. Although we think it illusory to expect technology, all by itself, to solve the problems of educational quality, we firmly take the view that it can contribute to improving learning conditions and can feed the pedagogical reflection that every teacher should conduct before delivering a course. In this framework, and convinced that open and distance learning offers significant advantages provided we think teaching "out of the box", we became interested in the development of this type of application, positioned at the intersection of didactics, cognitive science and computing. Hence, hoping to propose a realistic and simple solution that facilitates the development, updating, integration and sustainability of flexible and distance learning (FFD) applications, we got involved in concrete projects. As we gained field experience we observed that (i) the quality of flexible and distance learning modules is still very disappointing, among other reasons because the added value that technology can bring is, in our view, not sufficiently exploited, and (ii) to succeed, a project must not only bring a useful answer to a real need but also be managed efficiently with the support of a "champion". With the aim of proposing a project management approach adapted to the needs of flexible and distance learning, we first examined the characteristics of this type of project. We then analyzed existing project methodologies in the hope of being able to use one of them, or a suitable combination of those closest to our needs. Next, in an empirical manner and by successive iterations, we defined a pragmatic project management approach and contributed to the elaboration of decision-support cards that facilitate its implementation. We describe some of its actors, insisting particularly on the pedagogical engineer, whom we consider one of the key success factors of our approach and whose vocation is to orchestrate it. Finally, we validated our approach a posteriori by reviewing the course of four FFD projects in which we participated and which are representative of the projects encountered in a university setting. In conclusion, we believe that the implementation of our approach, together with the availability of computerized decision-support cards, constitutes an important asset and should make it easier to measure the real impacts of technologies (i) on the evolution of teaching practices, (ii) on the organization and (iii) on the quality of teaching. Our approach can also serve as a springboard for establishing a quality assurance process specific to FFD. Further research on the real flexibilization of learning and on the benefits of technologies for learners can then be conducted on the basis of metrics that remain to be defined.
Abstract:
This research examines the impacts of the Swiss reform of the allocation of tasks, which was accepted in 2004 and implemented in 2008 to "re-assign" responsibilities between the federal government and the cantons. The public tasks were redistributed according to the leading and fundamental principle of subsidiarity. Seven tasks came under exclusive federal responsibility; ten came under the control of the cantons; and twenty-two "common tasks" were allocated to both the Confederation and the cantons. For these common tasks it was not possible to separate management from implementation. To deal with nineteen of them, the reform introduced the conventions-programs (CPs), which are public-law contracts signed by the Confederation with each canton. These CPs are generally valid for four-year periods (2008-11, 2012-15 and 2016-19, respectively); the third period is currently being prepared. Using principal-agent theory, I examine how contracts can improve political relations between a principal (the Confederation) and an agent (a canton). I also provide a first qualitative analysis of the impacts of these contracts on vertical cooperation and on the involvement of the different actors, focusing on five CPs - protection of cultural heritage and conservation of historic monuments, encouragement of the integration of foreigners, economic development, protection against noise, and protection of nature and landscape - applied in five cantons, which represents twenty-five case studies.
Abstract:
NG2(+) glia, also known as polydendrocytes or oligodendrocyte precursor cells, represent a new entity among glial cell populations in the central nervous system. However, the complete repertoire of their roles has not yet been identified. Embryonic NG2(+) glia originate from the Nkx2.1(+) progenitors of the ventral telencephalon. Our analysis reveals that, from E12.5 to E16.5, NG2(+) glia come to populate the entire dorsal telencephalon. Interestingly, their appearance temporally coincides with the establishment of the blood vessel network in the embryonic brain. NG2(+) glia are closely apposed to developing cerebral vessels, either positioned at the sprouting tip cells or tethered along the vessel walls. The absence of NG2(+) glia drastically affects vascular development, leading to a severe reduction of ramifications and connections by E18.5. By revealing a novel and fundamental role for NG2(+) glia, our study brings new perspectives to the mechanisms underlying proper vessel network formation in the embryonic brain.