998 results for "debris flows, granular soils, ACU and CSD triaxial tests"


Relevance: 10.00%

Abstract:

We introduce two ways of comparing information structures, say ${\cal I}$ and ${\cal J}$. First, we say that ${\cal I}$ is richer than ${\cal J}$ when, for every compact game $G$, all correlated equilibrium distributions of $G$ induced by ${\cal J}$ are also induced by ${\cal I}$. Second, we say that ${\cal J}$ is faithfully reproducible from ${\cal I}$ when all the players can compute from their information in ${\cal I}$ "new information" that they could have received from ${\cal J}$. We prove that ${\cal I}$ is richer than ${\cal J}$ if and only if ${\cal J}$ is faithfully reproducible from ${\cal I}$.

Relevance: 10.00%

Abstract:

Minkowski's ?(x) function can be seen as the confrontation of two number systems: regular continued fractions and the alternated dyadic system. This way of looking at it permits us to prove that its derivative, when it exists, can only attain two values, zero and infinity, as also happens for many other non-decreasing singular functions from [0,1] to [0,1]. It is also proved that if the average of the partial quotients in the continued fraction expansion of x is greater than k* = 5.31972 and ?'(x) exists, then ?'(x) = 0. In the same way, if the same average is less than k** = 2 log2(F), where F is the golden ratio, then ?'(x) = infinity. Finally, some results are presented concerning metric properties of continued fraction and alternated dyadic expansions.
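The confrontation of the two number systems is concrete enough to compute. A short sketch (ours, not from the paper) evaluates ?(x) for rational x from its partial quotients via the classical alternating dyadic series ?([0; a1, a2, ...]) = 2 * sum_k (-1)^(k+1) * 2^(-(a1+...+ak)):

```python
from fractions import Fraction

def question_mark(x: Fraction) -> Fraction:
    """Minkowski's ?(x) for rational x in [0, 1], computed exactly from the
    continued fraction expansion via the alternating dyadic series."""
    result, s, sign = Fraction(0), 0, 1
    while x:
        a = int(1 / x)        # next partial quotient, a_k = floor(1/x)
        x = 1 / x - a         # tail of the continued fraction
        s += a                # running sum a_1 + ... + a_k
        result += sign * Fraction(1, 2 ** s)
        sign = -sign
    return 2 * result

print(question_mark(Fraction(1, 3)))  # 1/4
print(question_mark(Fraction(2, 3)))  # 3/4
```

The exact-arithmetic output illustrates the defining property that ?(x) sends rationals to dyadic rationals.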

Relevance: 10.00%

Abstract:

This paper studies the rate of convergence of an appropriate discretization scheme for the solution of the McKean-Vlasov equation introduced by Bossy and Talay. More specifically, we consider approximations of the distribution and of the density of the solution of the stochastic differential equation associated to the McKean-Vlasov equation. The scheme adopted here is a mixed one: Euler/weakly interacting particle system. If $n$ is the number of weakly interacting particles and $h$ is the uniform step in the time discretization, we prove that the rate of convergence of the distribution functions of the approximating sequence in the $L^1(\Omega\times \Bbb R)$ norm and in the sup norm is of the order of $\frac 1{\sqrt n} + h$, while for the densities it is of the order $h + \frac 1{\sqrt{nh}}$. This result is obtained by carefully employing techniques of Malliavin calculus.
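A minimal sketch of the mixed Euler/weakly-interacting-particle idea (not the Bossy-Talay scheme itself; the drift b(x, mu) = mean(mu) - x and the constant diffusion sigma are illustrative assumptions), where the empirical mean of the $n$ particles stands in for the law of the solution:

```python
import math
import random

def mckean_vlasov_euler(n=2000, steps=200, T=1.0, sigma=0.5, seed=0):
    """Euler scheme for a weakly interacting particle system approximating
    the illustrative McKean-Vlasov SDE dX_t = (E[X_t] - X_t) dt + sigma dW_t,
    whose drift depends on the law of X_t through its mean. The empirical
    mean of the n particles replaces the unknown expectation E[X_t]."""
    rng = random.Random(seed)
    h = T / steps                                  # uniform time step
    x = [rng.gauss(1.0, 1.0) for _ in range(n)]    # i.i.d. initial particles
    for _ in range(steps):
        m = sum(x) / n                             # empirical mean of the system
        x = [xi + (m - xi) * h + sigma * math.sqrt(h) * rng.gauss(0.0, 1.0)
             for xi in x]
    return x

particles = mckean_vlasov_euler()
empirical_mean = sum(particles) / len(particles)   # E[X_t] is constant (= 1) here
```

For this drift the mean is preserved exactly, which gives a cheap sanity check on the scheme; the theorem quoted above concerns the much finer question of the $L^1$ and sup-norm rates for the distribution functions and densities.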

Relevance: 10.00%

Abstract:

This dissertation sought to understand the thematic and discursive character of selected literary crónicas by two Cape Verdean authors, Dina Salústio and Daniel Medina, and to demonstrate the pedagogical and humanitarian character of the compositions. First, a study of theoretical material on the Portuguese crónica was carried out, surveying aspects of the genre's evolution from the Middle Ages to the present, always from a pragmatic perspective, and the defining features of the genre were discussed, notably subjectivity, versatility, objectivity, a style between the oral and the literary, everydayness, and dialogism. Methodologically, the notion of ethos of Amossy and Maingueneau was then adopted for the study of the compositions. In a second stage, ten Cape Verdean literary crónicas, five by each author, were analysed critically; the thematic analysis confirmed a pedagogical and humanitarian concern, since the texts show a certain unease with problems involving the idiosyncrasy of the Cape Verdean people, such as the loss of educational and instructional values both in the family and at school, concern for the materially disadvantaged, among whom orphans of living parents stand out, the neglect of the mentally ill and of street children, early pregnancy, and child prostitution, among others. The way the chroniclers position themselves in addressing these themes allowed us to project the discursive ethos of two sensitive, serious personalities committed to the noblest values that shape the human being.
Likewise, the analysis established as relevant features of the compositions their discursive subjectivity and personal voice, brevity, dialogue in free indirect style, everydayness, and literariness, allied to rhetorical devices that allowed us to identify the serious and ironic tones of the enunciating subject in treating these thematic questions.

Relevance: 10.00%

Abstract:

The aim of this work is to provide a brief overview of the impact of taxes on the budgets of local authorities in Cabo Verde and of how the revenue collected by the central State is then distributed to the municipalities. It should be noted that this work focuses on fiscal policy and on taxes in particular. It sets out to demonstrate the contribution of local authorities to the development of Cabo Verde and the need to transfer more resources to the municipal councils, with a view to greater and better development of the country, in light of the experience accumulated over the last two decades. The study also aims to diagnose the main financial problems with which local authorities struggle, which suggests adopting a set of changes to the current financial instruments. Some of the measures proposed here can and should be adopted immediately by local authorities. Above all, the study of the tax system seeks to promote the dissemination of legally grounded procedures, in order to help practitioners broaden their individual contribution to consolidating the financial autonomy of the municipalities. There have clearly been legislative advances, but a new model must be approved, whose application will surely bring greater benefits to the country, since it calls for more balanced development and will promote much broader intergenerational equity.

Relevance: 10.00%

Abstract:

A biplot, which is the multivariate generalization of the two-variable scatterplot, can be used to visualize the results of many multivariate techniques, especially those that are based on the singular value decomposition. We consider data sets consisting of continuous-scale measurements, their fuzzy coding and the biplots that visualize them, using a fuzzy version of multiple correspondence analysis. Of special interest is the way quality of fit of the biplot is measured, since it is well-known that regular (i.e., crisp) multiple correspondence analysis seriously under-estimates this measure. We show how the results of fuzzy multiple correspondence analysis can be defuzzified to obtain estimated values of the original data, and prove that this implies an orthogonal decomposition of variance. This permits a measure of fit to be calculated in the familiar form of a percentage of explained variance, which is directly comparable to the corresponding fit measure used in principal component analysis of the original data. The approach is motivated initially by its application to a simulated data set, showing how the fuzzy approach can lead to diagnosing nonlinear relationships, and finally it is applied to a real set of meteorological data.
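Fuzzy coding of a continuous measurement can be sketched directly. The following is a hypothetical three-category triangular scheme (one common choice; the hinge points are assumptions, not values from the paper), together with the defuzzification step that recovers the original value:

```python
def fuzzy_code(x, knots):
    """Triangular (piecewise linear) fuzzy coding of a continuous value into
    three categories with hinge points knots = (low, mid, high); the
    memberships are nonnegative and sum to 1."""
    lo, mid, hi = knots
    if x <= lo:
        return [1.0, 0.0, 0.0]
    if x >= hi:
        return [0.0, 0.0, 1.0]
    if x <= mid:
        t = (x - lo) / (mid - lo)
        return [1.0 - t, t, 0.0]
    t = (x - mid) / (hi - mid)
    return [0.0, 1.0 - t, t]

def defuzzify(memberships, knots):
    """Centroid reconstruction: inside the hinges it inverts fuzzy_code exactly."""
    return sum(m * k for m, k in zip(memberships, knots))

knots = (0.0, 5.0, 10.0)        # hypothetical hinges, e.g. min/median/max
coded = fuzzy_code(3.0, knots)  # [0.4, 0.6, 0.0]
```

Defuzzification is what makes the explained-variance comparison with principal component analysis meaningful: the fuzzy-coded data can be mapped back to estimates of the original measurements.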

Relevance: 10.00%

Abstract:

The Treatise on Quadrature of Fermat (c. 1659), besides containing the first known proof of the computation of the area under a higher parabola, $\int x^{m/n}\,dx$, or under a higher hyperbola, $\int x^{-m/n}\,dx$, with the appropriate limits of integration in each case, has a second part which was not understood by Fermat's contemporaries. This second part of the Treatise is obscure and difficult to read, and even the great Huygens described it as "published with many mistakes and it is so obscure (with proofs redolent of error) that I have been unable to make any sense of it". Far from the confusion that Huygens attributes to it, in this paper we try to prove that Fermat, in writing the Treatise, had a very clear goal in mind and managed to attain it by means of a simple and original method. Fermat reduced the quadrature of a great number of algebraic curves to the quadrature of known curves: the higher parabolas and hyperbolas of the first part of the paper. Others he reduced to the quadrature of the circle. We shall see how the clever use of two procedures, quite novel at the time (the change of variables and a particular case of the formula of integration by parts), provided Fermat with the necessary tools to square very easily curves as well known as the folium of Descartes, the cissoid of Diocles, or the witch of Agnesi.
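In modern notation (ours; Fermat argued geometrically), the two quadratures of the first part read:

```latex
\int_0^a x^{m/n}\,dx = \frac{n}{m+n}\,a^{(m+n)/n},
\qquad
\int_a^{\infty} x^{-m/n}\,dx = \frac{n}{m-n}\,a^{(n-m)/n} \quad (m > n),
```

the second integral converging precisely when $m > n$, which is why the limits of integration differ between the two cases.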

Relevance: 10.00%

Abstract:

Revenue management (RM) is a complicated business process that can best be described as control of sales (using prices, restrictions, or capacity), usually using software as a tool to aid decisions. RM software can play a merely informative role, supplying analysts with formatted and summarized data which they use to make control decisions (setting a price or allocating capacity for a price point), or, at the other extreme, play a deeper role, automating the decision process completely. The RM models and algorithms in the academic literature by and large concentrate on the latter, completely automated, level of functionality.

A firm considering a new RM model or RM system needs to evaluate its performance. Academic papers justify the performance of their models using simulations, where customer booking requests are simulated according to some process and model, and the revenue performance of the algorithm is compared to that of an alternate set of algorithms. Such simulations, while an accepted part of the academic literature, and indeed providing research insight, often lack credibility with management. Even methodologically, they are usually flawed, as the simulations only test "within-model" performance and say nothing as to the appropriateness of the model in the first place. Even simulations that test against alternate models or competition are limited by their inherent need to fix some model as the universe for their testing. These problems are exacerbated with RM models that attempt to model customer purchase behavior or competition, as the right models for competitive actions or customer purchases remain somewhat of a mystery, or at least there is no consensus on their validity.

How, then, to validate a model? Putting it another way, we want to show that a particular model or algorithm is the cause of a certain improvement to the RM process compared to the existing process. We take care to emphasize that we want to prove the said model is the cause of performance, and to compare against an (incumbent) process rather than against an alternate model.

In this paper we describe a "live" testing experiment that we conducted at Iberia Airlines on a set of flights. A set of competing algorithms controlled a set of flights during adjacent weeks, and their behavior and results were observed over a relatively long period of time (9 months). In parallel, a group of control flights was managed using the traditional mix of manual and algorithmic control (the incumbent system). Such "sandbox" testing, while common at many large internet search and e-commerce companies, is relatively rare in the revenue management area. Sandbox testing has an undisputable model of customer behavior, but the experimental design and the analysis of results are less clear. In this paper we describe the philosophy behind the experiment, the organizational challenges, and the design and setup of the experiment, and outline the analysis of the results. This paper is a complement to a (more technical) related paper that describes the econometrics and statistical analysis of the results.

Relevance: 10.00%

Abstract:

Dubey and Geanakoplos [2002] have developed a theory of competitive pooling, which incorporates adverse selection and signaling into general equilibrium. By recasting the Rothschild-Stiglitz model of insurance in this framework, they find that a separating equilibrium always exists and is unique. We prove that their uniqueness result is not a consequence of the framework, but rather of their definition of refined equilibria. When other types of perturbations are used, the model allows for many pooling allocations to be supported as such: in particular, this is the case for pooling allocations that Pareto dominate the separating equilibrium.

Relevance: 10.00%

Abstract:

We investigate on-line prediction of individual sequences. Given a class of predictors, the goal is to predict as well as the best predictor in the class, where the loss is measured by the self-information (logarithmic) loss function. The excess loss (regret) is closely related to the redundancy of the associated lossless universal code. Using Shtarkov's theorem and tools from empirical process theory, we prove a general upper bound on the best possible (minimax) regret. The bound depends on certain metric properties of the class of predictors. We apply the bound to both parametric and nonparametric classes of predictors. Finally, we point out a suboptimal behavior of the popular Bayesian weighted average algorithm.
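In standard notation (ours, not quoted from the paper), for a class $\mathcal F$ of predictors inducing distributions $p_f$ on sequences $x^n$ over a finite alphabet, the minimax regret under log loss and its Shtarkov characterization are:

```latex
r_n(\mathcal F)
  = \inf_{q}\,\sup_{x^n}\Big[\log\frac{1}{q(x^n)}
      - \inf_{f\in\mathcal F}\log\frac{1}{p_f(x^n)}\Big]
  = \log \sum_{x^n} \sup_{f\in\mathcal F} p_f(x^n),
```

where the infimum is over all probability distributions $q$ on length-$n$ sequences; upper bounds of the kind proved in the paper control the right-hand side via metric properties of $\mathcal F$.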

Relevance: 10.00%

Abstract:

This paper studies equilibria for economies characterized by moral hazard (hidden action), in which the set of contracts marketed in equilibrium is determined by the interaction of financial intermediaries. The crucial aspect of the environment that we study is that intermediaries are restricted to trade non-exclusive contracts: the agents' contractual relationships with competing intermediaries cannot be monitored (or are not contractible upon). We fully characterize equilibrium allocations and contracts. In this set-up, equilibrium allocations are clearly incentive-constrained inefficient. A robust property of equilibria with non-exclusivity is that the contracts issued in equilibrium do not implement the optimal action. Moreover, we prove that, whenever equilibrium contracts do implement the optimal action, intermediaries make positive profits and equilibrium allocations are third-best inefficient (where the definition of third-best efficiency accounts for constraints which capture the non-exclusivity of contracts).

Relevance: 10.00%

Abstract:

This study investigated concentrations of quetiapine and norquetiapine in plasma and cerebrospinal fluid (CSF) in 22 schizophrenic patients after 4-week treatment with quetiapine (600 mg/d), which was preceded by a 3-week washout period. Blood and CSF samples were obtained on days 1 and 28, and CSF levels of homovanillic acid (HVA), 5-hydroxyindoleacetic acid (5-HIAA), and 3-methoxy-4-hydroxyphenylglycol (MHPG) concentrations were measured at baseline and after 4 weeks of quetiapine, allowing calculations of differences in HVA (ΔHVA), 5-HIAA (Δ5-HIAA), and MHPG (ΔMHPG) concentrations. Patients were assessed clinically, using the Positive and Negative Syndrome Scale (PANSS) and Clinical Global Impression Scale at baseline and then at weekly intervals. Plasma levels of quetiapine and norquetiapine were 1110 ± 608 and 444 ± 226 ng/mL, and the corresponding CSF levels were 29 ± 18 and 5 ± 2 ng/mL, respectively. After the treatment, the levels of HVA, 5-HIAA, and MHPG were increased by 33%, 35%, and 33%, respectively (P < 0.001). A negative correlation was found between the decrease in PANSS positive subscale scores and CSF ΔHVA (r(rho) = -0.690, P < 0.01), and the decrease in PANSS negative subscale scores both with CSF Δ5-HIAA (r(rho) = -0.619, P = 0.02) and ΔMHPG (r(rho) = -0.484, P = 0.038). Because, unfortunately, schizophrenic patients experience relapses even with the best available treatments, monitoring of CSF drug and metabolite levels might prove to be useful in tailoring individually adjusted treatments.

Relevance: 10.00%

Abstract:

Let there be a positive (exogenous) probability that, at each date, the human species will disappear. We postulate an Ethical Observer (EO) who maximizes intertemporal welfare under this uncertainty, with expected-utility preferences. Various social welfare criteria entail alternative von Neumann-Morgenstern utility functions for the EO: utilitarian, Rawlsian, and an extension of the latter that corrects for the size of population. Our analysis covers, first, a cake-eating economy (without production), where the utilitarian and the Rawlsian recommend the same allocation. Second, a productive economy with education and capital, where it turns out that the recommendations of the two EOs are in general different. But when the utilitarian program diverges, then we prove it is optimal for the extended Rawlsian to ignore the uncertainty concerning the possible disappearance of the human species in the future. We conclude by discussing the implications for intergenerational welfare maximization in the presence of global warming.
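To fix ideas (our sketch of the standard formulation; the notation is not taken from the paper): if the species survives each date independently with probability $1-p$, the utilitarian EO's expected welfare over a consumption stream $(c_t)$ reduces to exponential discounting at the survival rate:

```latex
W = \mathbb{E}\Big[\sum_{t=0}^{T} u(c_t)\Big]
  = \sum_{t=0}^{\infty} \Pr(T \ge t)\, u(c_t)
  = \sum_{t=0}^{\infty} (1-p)^{t}\, u(c_t),
```

where $T$ is the random date of extinction, so that extinction risk plays the formal role of a discount factor $1-p$.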

Relevance: 10.00%

Abstract:

Although Drosophila systemic immunity is extensively studied, little is known about the fly's intestine-specific responses to bacterial infection. Global gene expression analysis of Drosophila intestinal tissue to oral infection with the Gram-negative bacterium Erwinia carotovora revealed that immune responses in the gut are regulated by the Imd and JAK-STAT pathways, but not the Toll pathway. Ingestion of bacteria had a dramatic impact on the physiology of the gut that included modulation of stress response and increased stem cell proliferation and epithelial renewal. Our data suggest that gut homeostasis is maintained through a balance between cell damage due to the collateral effects of bacteria killing and epithelial repair by stem cell division. The Drosophila gut provides a powerful model to study the integration of stress and immunity with pathways associated with stem cell control, and this study should prove to be a useful resource for such further studies.

Relevance: 10.00%

Abstract:

Since spring 2004 a new metro line has been under construction in the city of Lausanne in Switzerland. The new line, the M2, will be 6 km long and will traverse the city from south to north, linking Ouchy, on the shore of Lake Geneva (alt. 373 m), to Epalinges (alt. 711 m) from 2008. The civil engineering project determined that the line would be located primarily in the Molasse. Since the preparatory project in 1999, a great quantity of geological data has been collected, and the many drillings made on the site have provided a unique opportunity to undertake a study of detailed urban microgravimetry. The goal was to evaluate the thickness of the morainic fill over the molassic bedrock, which lies at a depth varying from a few metres to about thirty metres, and to establish a section along the axis of the future line. Because the tunnelling method depends strongly on the materials to be excavated, and because the spatial representativeness of highly localized information such as a borehole is limited, geophysical prospecting, and gravimetry in particular, can supply decisive complementary information to regionalize the pointwise borehole data. It then had to be shown that the application of this non-destructive geophysical method could reduce the number of mechanical surveys required both for a preparatory and a definitive project, which would lead to real savings in the realization of a civil engineering project, and less disturbance from surface works. The two test zones chosen, one in the northern part of the city and one in the city centre, are characterized by the various types of urbanisation present in Lausanne and are therefore very useful for developing an overall methodology of urban microgravimetry. Microgravimetry in an urban environment requires careful correction of the gravity disturbances due to the effects of topography, buildings, cellars, and the infrastructure of distribution networks, in order to isolate the gravity effect due exclusively to the thickness of the loose-soil fill. Bearing in mind the intensity of the topographic corrections in an urban environment, we gave particular importance to basements: their gravity effects can reach the order of one tenth of a mGal and influence above all the precision of the Bouguer anomaly. We therefore included the roadway and the basements in the digital terrain model, building an urban digital terrain model for which we propose the new acronym "MNTU", while treating the effects of the buildings themselves separately. We propose to establish preliminary topographic-correction charts based on data provided by the land register, making assumptions about the depths of basements and the heights of buildings. Availability of such a chart before a gravimetric campaign makes it possible to choose optimal measuring sites. We have also seen that an a priori filter can be used when the form and the intensity of an anomaly can be matched visually to the corresponding building; this strategy must be used with caution, because if other anomalies are present, important shifts can be generated by the superposition of the effects of different structures. The results of the models proved very convincing, detecting zones that differ appreciably from the preliminary geological model. The adaptability of the gravimetric technique allows it to be applied in all phases of a civil engineering project such as the construction of an underground metro line.
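The orders of magnitude quoted for basements and moraine thickness can be checked with the infinite-slab (Bouguer) approximation, dg = 2*pi*G*rho*h; the density contrasts below are illustrative assumptions, not values from the thesis:

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
MGAL = 1e-5     # 1 mGal = 1e-5 m/s^2

def slab_effect_mgal(thickness_m, density_contrast_kg_m3):
    """Gravity effect of an infinite horizontal slab, dg = 2*pi*G*rho*h,
    expressed in mGal."""
    return 2 * math.pi * G * density_contrast_kg_m3 * thickness_m / MGAL

# ~10 m of moraine with an assumed -600 kg/m^3 contrast against molasse:
print(round(slab_effect_mgal(10.0, -600.0), 3))    # -0.252 (mGal)

# a 2.5 m empty basement with an assumed -2000 kg/m^3 contrast against soil:
print(round(slab_effect_mgal(2.5, -2000.0), 3))    # -0.21 (mGal)
```

An unmodelled basement alone is thus already above the tenth-of-a-mGal level mentioned in the text, which is why the basements had to be folded into the terrain correction.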