832 results for Guarantees


Relevance:

10.00%

Publisher:

Abstract:

In light of the events of 2010 and 2011, many European countries found themselves in a difficult position as credit rating agencies downgraded their sovereign debt. Solvency problems and guarantees on state bonds were perceived as too risky for a monetary union such as Europe. Fear of contagion from Greece also threatened other countries such as Italy, Spain, Portugal and Ireland, while Germany and France asked for a division between risky and riskless bonds in order to feel safer. Our paper draws inspiration from Roch and Uhlig (2011), refers to the Argentinian case examined by Arellano (2008), and examines possible interventions such as monetization or bailouts as proposed by Cole and Kehoe (2000). We propose a model in which a state defaults and cannot repay a fraction of the old bonds; but, contrary to Roch and Uhlig, who considered a one-time cost of default, we treat default as an accumulation of losses, perceived as unpaid fractions of the old debts. Our contribution to the literature is that default immediately implies that the economy faces a bad period and that, by accumulating losses, the government will be worse off. We study a function for this accumulation of debt period by period, in order to gauge the magnitude of the waste of resources that the economy faces when it experiences a default. Our thesis is that bailouts just postpone the day of reckoning (Roch and Uhlig), so it is better to default before accumulating large debts. What Europe needs now is the introduction of new reforms within a controlled default, in which the Eurozone would be saved in its entirety and a state could fail with the future promise of resurrection. As experience shows, governments are not interested in reducing debts as long as there are ECB interventions. That clearly creates a distortion between countries in the same monetary union, giving the states only an illusion about their future debtor position.
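The period-by-period accumulation of losses can be sketched numerically. The function, parameters, and horizon below are invented for illustration only; they are not the paper's calibration.

```python
# Sketch of the accumulated-loss idea: each period after default, a fraction
# of the old debt remains unpaid and adds to a running stock of losses.
# All parameters are illustrative, not taken from the paper.

def accumulated_losses(debt, unpaid_fraction, growth, periods):
    """Total losses when default leaves `unpaid_fraction` of the old
    debt unserviced each period, while the debt stock grows at `growth`."""
    losses = 0.0
    for _ in range(periods):
        losses += unpaid_fraction * debt
        debt *= 1.0 + growth
    return losses

# Defaulting early (small debt stock) wastes fewer resources than
# defaulting after bailouts have let the debt stock grow.
early = accumulated_losses(debt=100.0, unpaid_fraction=0.1, growth=0.05, periods=10)
late = accumulated_losses(debt=150.0, unpaid_fraction=0.1, growth=0.05, periods=10)
assert early < late
```

Under these assumptions, postponing default with a larger initial debt stock strictly increases the total waste of resources, which is the paper's argument against bailouts in miniature.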

Relevance:

10.00%

Publisher:

Abstract:

This thesis deals with the design of advanced OFDM systems; both waveform and receiver design are treated. The main scope of the thesis is to study, create, and propose ideas and novel design solutions able to cope with the weaknesses and crucial aspects of modern OFDM systems. Starting from the transmitter side, the problem represented by low resilience to non-linear distortion has been assessed, and a novel technique that considerably reduces the Peak-to-Average Power Ratio (PAPR), yielding a quasi-constant signal envelope in the time domain (PAPR close to 1 dB), has been proposed. The proposed technique, named Rotation Invariant Subcarrier Mapping (RISM), is a novel scheme for subcarrier data mapping in which the symbols belonging to the modulation alphabet are not anchored, but maintain some degrees of freedom. In other words, a bit tuple is not mapped onto a single point; rather, it is mapped onto a geometrical locus which is totally or partially rotation invariant. The final positions of the transmitted complex symbols are chosen by an iterative optimization process that minimizes the PAPR of the resulting OFDM symbol. Numerical results confirm that RISM makes OFDM usable even in severely non-linear channels. Another well-known problem that has been tackled is the vulnerability to synchronization errors. In an OFDM system, accurate recovery of carrier frequency and symbol timing is crucial for the proper demodulation of the received packets. In general, timing and frequency synchronization is performed in two separate phases, called PRE-FFT and POST-FFT synchronization. Regarding the PRE-FFT phase, a novel joint symbol-timing and carrier-frequency synchronization algorithm has been presented. The proposed algorithm is characterized by very low hardware complexity and, at the same time, guarantees very good performance in both AWGN and multipath channels.
Regarding the POST-FFT phase, a novel approach to both pilot structure and receiver design has been presented. In particular, a novel pilot pattern has been introduced in order to minimize the occurrence of overlaps between two shifted replicas of the pattern. This makes it possible to replace conventional pilots with nulls in the frequency domain, introducing the so-called Silent Pilots. As a result, the optimal receiver turns out to be very robust against severe Rayleigh multipath fading while retaining low complexity. The performance of this approach has been evaluated both analytically and numerically; compared with state-of-the-art alternatives, in both AWGN and multipath fading channels, considerable performance improvements have been obtained. The crucial problem of channel estimation has been thoroughly investigated, with particular emphasis on the decimation of the Channel Impulse Response (CIR) through the selection of the Most Significant Samples (MSSs). In this context our contribution is twofold: on the theoretical side, we derived lower bounds on the estimation mean-square error (MSE) for any MSS selection strategy; on the receiver-design side, we proposed novel MSS selection strategies which approach these MSE lower bounds and outperform the state-of-the-art alternatives. Finally, the possibility of using Single Carrier Frequency Division Multiple Access (SC-FDMA) in the broadband satellite return channel has been assessed. Notably, SC-FDMA is able to improve the physical-layer spectral efficiency with respect to the single-carrier systems used so far in the Return Channel Satellite (RCS) standards. However, it requires strict synchronization and is also sensitive to the phase noise of local radio-frequency oscillators. For this reason, an effective pilot-tone arrangement within the SC-FDMA frame and a novel Joint Multi-User (JMU) estimation method for SC-FDMA have been proposed.
As shown by numerical results, the proposed scheme manages to satisfy strict synchronization requirements and to guarantee a proper demodulation of the received signal.
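The PAPR metric that RISM optimizes can be computed in a few lines: the subcarrier symbols are brought to the time domain with an inverse DFT, and the PAPR is the peak instantaneous power over the mean power, in dB. This sketch shows only the metric, not the iterative RISM mapping itself; the QPSK data below are an arbitrary example.

```python
# Minimal PAPR computation for an OFDM symbol (illustrative only).
import cmath
import math
import random

def inverse_dft(freq):
    """Time-domain samples of one OFDM symbol from its subcarrier symbols."""
    n = len(freq)
    return [sum(freq[k] * cmath.exp(2j * cmath.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def papr_db(freq_symbols):
    powers = [abs(v) ** 2 for v in inverse_dft(freq_symbols)]
    return 10 * math.log10(max(powers) / (sum(powers) / len(powers)))

# a single active subcarrier gives a constant envelope: PAPR = 0 dB
single = [0j] * 64
single[3] = 1 + 0j
assert abs(papr_db(single)) < 1e-9

# a random QPSK OFDM symbol typically shows a PAPR of several dB
random.seed(0)
qpsk = [complex(random.choice((-1, 1)), random.choice((-1, 1))) / math.sqrt(2)
        for _ in range(64)]
print(f"PAPR: {papr_db(qpsk):.1f} dB")
```

The gap between the constant-envelope case and the random-symbol case is exactly what techniques such as RISM try to close by moving the transmitted points along rotation-invariant loci.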

Relevance:

10.00%

Publisher:

Abstract:

Italy and France in Trianon's Hungary: two models of political and cultural penetration. During the first post-war period, Danubian Europe was the theatre of an Italian-French diplomatic contest for hegemony in that part of the continent. Because of its geographical position, Hungary had decisive strategic importance for the ambitions of French and Italian foreign policy. Since in the 1920s culture and propaganda became the fourth dimension of international relations, Rome and Paris developed their diplomatic action in Hungary to assert not only political and economic influence but also cultural supremacy. In the 1930s, after Hitler's rise to power, the unstoppable comeback of German political influence in central-eastern Europe determined the progressive decline of the Italian and French political and economic positions in Hungary: only the cultural field allowed Italian-Hungarian and French-Hungarian relations to survive in the context of a Europe dominated by Nazi Germany during the Second World War. Nevertheless, the radical geopolitical changes of post-1945 Europe did not compromise the Italian and French cultural presence in the new communist Hungary. Although cultural diplomacy is originally motivated by contingent political aims, it does not follow the short timescale of politics; it is the only foreign-policy tool that guarantees the preservation of bilateral relations in the long run.

Relevance:

10.00%

Publisher:

Abstract:

Cost, performance and availability considerations are forcing even the most conservative high-integrity embedded real-time systems industry to migrate from simple hardware processors to ones equipped with caches and other acceleration features. This migration disrupts the practices and solutions that industry had developed and consolidated over the years to perform timing analysis. Industries that are confident in the efficiency and effectiveness of their verification and validation processes for old-generation processors do not have sufficient insight into the effects of the migration to cache-equipped processors. Caches are perceived as an additional source of complexity with the potential to shatter the guarantees of cost- and schedule-constrained qualification of their systems. The current industrial approach to timing analysis is ill-equipped to cope with the variability incurred by caches. Conversely, the application of advanced WCET analysis techniques to real-world industrial software, developed without analysability in mind, is hardly feasible. We propose a development approach aimed at minimising cache jitter, as well as at enabling the application of advanced WCET analysis techniques to industrial systems. Our approach builds on: (i) identification of those software constructs that may impede or complicate timing analysis in industrial-scale systems; (ii) elaboration of practical means, under the model-driven engineering (MDE) paradigm, to enforce the automated generation of software that is analysable by construction; (iii) implementation of a layout optimisation method to remove cache jitter stemming from the software layout in memory, with the intent of facilitating incremental software development, which is of high strategic interest to industry.
The integration of those constituents in a structured approach to timing analysis achieves two interesting properties: the resulting software is analysable from the earliest releases onwards - as opposed to becoming so only when the system is final - and more easily amenable to advanced timing analysis by construction, regardless of the system scale and complexity.
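The layout-related jitter mentioned in point (iii) can be sketched in miniature: in a direct-mapped cache, two routines whose code maps to the same cache sets evict each other, and moving one of them to addresses that map to disjoint sets removes the conflict. The cache geometry and routine sizes below are invented for illustration and are not drawn from the thesis.

```python
# Toy sketch of layout optimisation against cache conflicts.
# Geometry of an assumed direct-mapped instruction cache (illustrative).
LINE = 64          # bytes per cache line
SETS = 128         # number of sets

def cache_sets(addr, size):
    """Set indices touched by code occupying [addr, addr + size)."""
    first, last = addr // LINE, (addr + size - 1) // LINE
    return {line % SETS for line in range(first, last + 1)}

def conflict_free(placements):
    """True if no two routines share a cache set."""
    seen = set()
    for addr, size in placements:
        sets = cache_sets(addr, size)
        if seen & sets:
            return False
        seen |= sets
    return True

hot_a = (0, 2048)               # routine A at offset 0, 2 KiB of code
bad_b = (SETS * LINE, 2048)     # maps onto the same sets as A -> evictions
good_b = (2048, 2048)           # disjoint sets -> no mutual eviction
assert not conflict_free([hot_a, bad_b])
assert conflict_free([hot_a, good_b])
```

A real layout optimiser solves this placement problem for whole programs, which is what makes the resulting execution times stable across incremental releases.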

Relevance:

10.00%

Publisher:

Abstract:

The study of production techniques for structural components in composite material provided a precise picture of the state of the art of the sector, in particular with reference to the processes currently used for medium/large-series industrialization. With the aim of combining the main advantages of these technologies and enabling the production of more complex shapes, a feasibility analysis was carried out, through a functional study and a preliminary design of a production technology for the automated tape-laying of structural composite components. The flexibility and consistency of the process were then demonstrated by designing a taped carbon-fibre chassis, interchangeable with the tubular-steel FSAE 2009 chassis (same engine mounting points, rear subframe attachment points, and front suspension attachments) and guaranteeing a substantial weight advantage at equal torsional stiffness. The chassis was characterized by means of structural analysis, validated by experimental tests.
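The "equal torsional stiffness, lower weight" target can be expressed as a simple figure of merit: torsional stiffness (applied torque over twist angle) divided by mass. All numbers below are invented for illustration; they are not the measured FSAE frame data.

```python
# Stiffness-to-weight comparison of two frames (invented numbers).

def torsional_stiffness(torque_nm, twist_deg):
    """Torsional stiffness in N*m per degree of twist."""
    return torque_nm / twist_deg

steel = {"k": torsional_stiffness(1000.0, 0.5), "mass": 30.0}   # tubular steel
carbon = {"k": torsional_stiffness(1000.0, 0.5), "mass": 18.0}  # taped carbon

# same stiffness, lower mass -> higher stiffness-to-weight ratio
assert carbon["k"] == steel["k"]
assert carbon["k"] / carbon["mass"] > steel["k"] / steel["mass"]
```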

Relevance:

10.00%

Publisher:

Abstract:

The research project falls within the field of judicial informatics, the area that studies the information systems implemented in judicial offices with the aim of improving the efficiency of the service and providing a lever to reduce long trial times, with the ultimate goal of better guaranteeing the rights of citizens and increasing the country's competitiveness. The specific object of study of the research project is the use of ICT in criminal proceedings. This is a less studied area than civil proceedings, yet the efficiency crisis is no less felt there: the backlog as of 30 June 2011 was quantified at 3.4 million criminal proceedings, with an average time to resolution of four years and nine months. Looking at criminal proceedings through the eyes of information-system design means seeing an uninterrupted flow of information that includes stages upstream and downstream of the trial itself: from the transmission of the notitia criminis to the execution of the sentence. From this perspective, the importance of correct information management becomes evident: the quantity, accuracy, and speed of access to information are such crucial factors for criminal proceedings that the efficiency of the information system and the quality of the justice delivered are strongly interrelated. The research project aims to identify the conditions under which efficiency can actually be achieved and, above all, to verify which technological choices can preserve, or even strengthen, the principles and guarantees of criminal proceedings.
Criminal proceedings, in fact, involve fundamental rights of the individual such as personal liberty, dignity, and privacy, rights that are protected through a wide range of procedural guarantees such as the presumption of innocence, the right of defence, the right to an adversarial process, and the rehabilitative purpose of punishment.

Relevance:

10.00%

Publisher:

Abstract:

The lattice Boltzmann method is a popular approach for simulating hydrodynamic interactions in soft matter and complex fluids. The solvent is represented on a discrete lattice whose nodes are populated by particle distributions that propagate on the discrete links between the nodes and undergo local collisions. On large length and time scales, the microdynamics leads to a hydrodynamic flow field that satisfies the Navier-Stokes equation. In this thesis, several extensions to the lattice Boltzmann method are developed. In complex fluids, such as suspensions, Brownian motion of the solutes is of paramount importance. However, it cannot be simulated with the original lattice Boltzmann method because the dynamics is completely deterministic. It is possible, though, to introduce thermal fluctuations in order to reproduce the equations of fluctuating hydrodynamics. In this work, a generalized lattice gas model is used to systematically derive the fluctuating lattice Boltzmann equation from statistical mechanics principles. The stochastic part of the dynamics is interpreted as a Monte Carlo process, which is then required to satisfy the condition of detailed balance. This leads to an expression for the thermal fluctuations which implies that it is essential to thermalize all degrees of freedom of the system, including the kinetic modes. The new formalism guarantees that the fluctuating lattice Boltzmann equation is simultaneously consistent with both fluctuating hydrodynamics and statistical mechanics. This establishes a foundation for future extensions, such as the treatment of multi-phase and thermal flows. An important range of applications for the lattice Boltzmann method is formed by microfluidics. Fostered by the "lab-on-a-chip" paradigm, there is an increasing need for computer simulations which are able to complement the achievements of theory and experiment.
Microfluidic systems are characterized by a large surface-to-volume ratio and, therefore, boundary conditions are of special relevance. On the microscale, the standard no-slip boundary condition used in hydrodynamics has to be replaced by a slip boundary condition. In this work, a boundary condition for lattice Boltzmann is constructed that allows the slip length to be tuned by a single model parameter. Furthermore, a conceptually new approach for constructing boundary conditions is explored, where the reduced symmetry at the boundary is explicitly incorporated into the lattice model. The lattice Boltzmann method is systematically extended to the reduced symmetry model. In the case of a Poiseuille flow in a plane channel, it is shown that a special choice of the collision operator is required to reproduce the correct flow profile. This systematic approach sheds light on the consequences of the reduced symmetry at the boundary and leads to a deeper understanding of boundary conditions in the lattice Boltzmann method. This can help to develop improved boundary conditions that lead to more accurate simulation results.
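The propagate-and-collide cycle at the core of the method can be shown in its simplest form: a one-dimensional D1Q3 model with a BGK collision operator for pure diffusion. Three populations per node (velocities -1, 0, +1) stream to their neighbours and then relax toward an equilibrium proportional to the local density. This is a deterministic, minimal sketch of the update rule only; it includes none of the fluctuation or boundary-condition extensions developed in the thesis.

```python
# Minimal 1-D lattice Boltzmann (D1Q3, BGK) for diffusion on a periodic ring.
W = [1 / 6, 2 / 3, 1 / 6]    # lattice weights for velocities c = -1, 0, +1
TAU = 1.0                    # BGK relaxation time

def step(f):
    n = len(f[0])
    # streaming: each population moves one node along its velocity (periodic)
    f = [f[0][1:] + f[0][:1],        # c = -1 shifts left
         f[1][:],                    # c = 0 stays put
         f[2][-1:] + f[2][:-1]]      # c = +1 shifts right
    # collision: relax toward the equilibrium f_i^eq = w_i * rho
    for x in range(n):
        rho = f[0][x] + f[1][x] + f[2][x]
        for i in range(3):
            f[i][x] += (W[i] * rho - f[i][x]) / TAU
    return f

n = 32
# unit mass concentrated at the centre node, split by the lattice weights
f = [[W[i] * (1.0 if x == n // 2 else 0.0) for x in range(n)] for i in range(3)]
for _ in range(50):
    f = step(f)
rho = [f[0][x] + f[1][x] + f[2][x] for x in range(n)]
assert abs(sum(rho) - 1.0) < 1e-12   # collisions and streaming conserve mass
assert max(rho) < 0.5                # the initial peak has diffused
```

Even in this toy form, the two defining properties are visible: streaming is an exact permutation of populations, and the local collision conserves density, which is what lets the macroscopic limit satisfy a hydrodynamic equation.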

Relevance:

10.00%

Publisher:

Abstract:

Photography is used intermedially for the narration of counter-memories and traumatic memories, drawing on numerous different modes and strategies of insertion and use. If, on the one hand, intermediality cannot be reduced to a set of conventional practices but depends on the narrative context, on the other hand it possesses an organic quality that functionally aligns it with processes of, and inquiries into, the representability of trauma. Moreover, owing to the versatility of its multifaceted nature, intermedial narrative practice (in its most diverse configurations) takes on an epistemological and methodological value for studies on the externalization and re-elaboration of trauma. This study aims to compare theoretical texts and narrative texts in order to highlight their mutual contribution.

Relevance:

10.00%

Publisher:

Abstract:

This research aims to identify and resolve some problems of classification and applicable rules concerning the institution regulated by Article 8 of Law no. 40/2007, as subsequently amended and supplemented, defined at the legislative level as "portability of the mortgage loan". In particular, we asked how the new rules on the transferability of loans fit within the discipline of subrogation, whether the latter can be considered a possible instrument for the circulation of credit, and whether Article 8 can be regarded as a modern rewriting of the institution found in the Civil Code. Although Article 8 is not limited to mortgage-backed loans, such loans constitute the main field of application of the rules. For this reason it has been argued that the provision, rather than the "portability of the loan", is intended to encourage the "portability of the mortgage", the latter meaning the subrogation of the new lender in the mortgage-backed credit, or more specifically in the mortgage, pursuant to Article 1202 of the Civil Code. The study of the effects of subrogation, as provided for by the 2007 law, on guarantees in general and on mortgages in particular, has shown how the legislator, by introducing simplified rules, intended to adapt traditional legal institutions to the practical need for flexibility in the credit market; this, however, with little success, leaving certain interpretative doubts open. To deepen the research, the subject was also examined from a comparative perspective, highlighting the main differences at the European level in the circulation of credit, the portability of loans, and the transferability of guarantees.

Relevance:

10.00%

Publisher:

Abstract:

In this work we studied the efficiency of the benchmarks used in the asset-management industry. In chapter 2 we analyzed the efficiency of the benchmarks used for government bond markets. We found that for emerging-market bonds an equally weighted index for the country weights is probably the most suitable, because it guarantees maximum diversification of country risk, whereas for the Eurozone government bond market a GDP-weighted index is better, because the most important issue is to avoid a higher weight for highly indebted countries. In chapter 3 we analyzed the efficiency of a derivatives index, rather than a cash index, for investing in the European corporate bond market. The two indexes are similar in terms of returns, but the derivatives index is less risky: it has a lower volatility, values of skewness and kurtosis closer to those of a normal distribution, and is a more liquid instrument, as its autocorrelation is not significant. Chapter 4 analyzes the impact of fallen angels on corporate bond portfolios; our analysis investigated the impact of the month-end rebalancing of the ML Emu Non Financial Corporate Index upon the exit of downgraded bonds (the event). We conclude that a flexible approach to the month-end rebalancing is preferable in order to avoid a loss of value due to the benchmark construction rules. In chapter 5 we compared the equally weighted and capitalization-weighted methods for the European equity market. The benefit of reweighting the portfolio into equal weights can be attributed to the fact that EW portfolios implicitly follow a contrarian investment strategy, because they mechanically rebalance away from stocks that increase in price.
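The contrarian mechanics of equal weighting can be seen in a two-stock toy example: rebalancing back to equal weights sells the stock that just rose and buys the one that just fell. The prices and returns below are made up for the example.

```python
# Why equal weighting is implicitly contrarian (invented numbers).

def rebalance_to_equal(values):
    """Reset holdings to equal weights while keeping total value fixed."""
    total = sum(values)
    return [total / len(values)] * len(values)

holdings = [50.0, 50.0]                  # start with a 50/50 portfolio
returns = [0.20, -0.10]                  # one period: stock 1 rises, stock 2 falls
grown = [h * (1 + r) for h, r in zip(holdings, returns)]   # [60.0, 45.0]
rebalanced = rebalance_to_equal(grown)                     # [52.5, 52.5]

assert grown[0] > rebalanced[0]          # the winner is trimmed...
assert grown[1] < rebalanced[1]          # ...and the loser is topped up
```

A capitalization-weighted portfolio, by contrast, requires no such trade: its weights drift with prices, so it never sells winners to buy losers.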

Relevance:

10.00%

Publisher:

Abstract:

In this thesis, the industrial application of sensorless control of a Permanent Magnet Synchronous Motor has been addressed, in particular the task of estimating the unknown "parameters" necessary for the application of standard motor control algorithms. Several techniques have been proposed in the literature to cope with this task; among them, the approach based on model-based nonlinear observers has been followed. The mechanical dynamics have been neglected in the motor model for practical and physical reasons, so only the electromagnetic dynamics are used in the observer design. The first observer proposed is based on the stator currents and the stator flux dynamics described in a generic rotating reference frame. The stator flux dynamics are known apart from their initial conditions, which are estimated, together with the unknown speed, by means of adaptive theory. The second observer proposed is based on the stator currents and the rotor flux dynamics described in a self-aligning reference frame. The rotor flux dynamics are described in the stationary reference frame using polar coordinates instead of classical Cartesian coordinates, through the estimation of the amplitude and speed of the rotor flux. The stability proof is derived in a singular perturbation framework, which allows the current estimation errors to be used as a measure of the rotor flux estimation errors. The stability properties have been derived using a specific theory for systems with time-scale separation, which guarantees semi-global practical stability. For both observers, ideal and realistic simulations have been performed to prove their effectiveness; in the realistic simulations the effects of inverter nonlinearities have been introduced, showing the already known problems of model-based observers in low-speed applications.
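The model-based observer idea can be shown in miniature: simulate a system, feed its measured output into a copy of the model with a correction gain, and watch the estimation error decay. This scalar example only illustrates the output-injection mechanism; the thesis observers act on the full electromagnetic (current and flux) dynamics of the PMSM, and the pole and gain below are invented.

```python
# Scalar stand-in for a model-based observer with output injection.
DT = 1e-3
A, L_GAIN = -2.0, 50.0        # plant pole and observer gain (illustrative)

x, x_hat = 1.0, 0.0           # true state vs observer estimate
for _ in range(2000):
    y = x                                              # measured output
    x += DT * A * x                                    # plant dynamics (Euler step)
    x_hat += DT * (A * x_hat + L_GAIN * (y - x_hat))   # model copy + correction

assert abs(x - x_hat) < 1e-3  # the estimate converges to the true state
```

The estimation error obeys its own dynamics with pole A - L_GAIN, so a larger gain makes the estimate converge faster than the plant itself evolves, which is the time-scale separation exploited in the singular perturbation analysis.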

Relevance:

10.00%

Publisher:

Abstract:

Carbohydrates have so far only rarely been used for the preparation of chiral ligands. They are considered too polyfunctional and conformationally too flexible to allow, with reasonable effort, the synthesis of ligands that meet the requirements of a high-performance catalyst system, namely the specific complexation of the metal in a conformationally rigid environment. The element of planar chirality has proven decisive for achieving high enantioselectivities in many asymmetric catalytic processes. The present work builds on carbohydrate-ligand syntheses with glycosylamines that possess suitable complexing centres, in order to carry out enantioselective catalyses other than those previously known with carbohydrates. For the synthesis of nitrogen-containing chiral compounds, perpivaloylated glycosylamines of the 2,3,4,6-tetra-O-pivaloyl-β-D-galactopyranosylamine type have proven particularly useful. In this dissertation, Schiff bases were prepared from pivaloylated galactosylamine, or from various other galactosylamine building blocks, as the chiral backbone and an aldehyde based on planar-chiral [2.2]paracyclophane. The new N-galactosylimines were also investigated in asymmetric Ugi reactions and in tandem Mannich-Michael reactions to give N-galactosyl dehydropiperidinones. Cleavage of the prepared N-galactosylimines of paracyclophane aldehydes under mineral-acid conditions was expected to afford the corresponding mono- and di-substituted formyl[2.2]paracyclophanes in enantiomerically pure form. The compounds obtained were investigated as potential N,O-ligands in the asymmetric Strecker reaction, in enantioselective epoxidations, and in the addition of diethylzinc to aromatic and aliphatic aldehydes.

Relevance:

10.00%

Publisher:

Abstract:

Summary

The bait-lamina test, the evaluation of the minicontainers, and the sampling of soil organisms by means of soil cores together provide a well-standardized method for describing and assessing the mesofauna. Provided the abiotic factors are comparable, even very different sites such as agricultural land, vineyards and forest soils can be compared without difficulty.

On the various experimental plots of the Laubenheim vineyard it was clearly shown how important near-natural green cover is for the soil. This concerns not only the development of the humus layer and thus the soil organisms, but also the creation of capillaries and pores, which are reduced through soil compaction by heavy agricultural machinery. Erosion phenomena come to a complete standstill. The soil ecosystem should also be exposed to virtually no load of herbicides, insecticides and pesticides. The same applies to agricultural land.

The Lenneberg forest, as a local recreation area of Mainz, is particularly worthy of protection, since it is heavily burdened by intensive immission inputs due to its proximity to the motorways and by eutrophication caused by pets. The ever greater expansion of the settlement area and the associated increasing number of forest visitors, who destroy the soil by leaving the designated paths, additionally endanger the ecosystem.

This is not the place to discuss the pros and cons of land consolidation. From a soil-science point of view it is not to be endorsed, since it ignores all soil-conserving measures. It is important to educate farmers about what soil-conserving and soil-improving cultivation methods mean. With proper education and correct implementation, green-cover measures can preserve, promote and, in the long run, stabilize soil that is in places severely strained.

On the basis of these findings, a permanent full green cover was introduced from 2008 onwards, so that soil improvement can also occur in the unplanted vine rows. In all likelihood this will proceed faster, since the mesofauna can immigrate from the neighbouring planted rows.

The mesofauna of agricultural land and forest areas, although extremely different, can be compared. Fallow land and forest areas can even be compared well on account of their undisturbed soil structures. Temperature and precipitation conditions must match, and the acidity of the respective soils must be taken into account, since different animal groups deal with it differently. Collembola prefer neutral soils, while Acari, as predators, cope better with the organisms of acidic soils. The litter layer is of great importance here.

Any use of machinery during cultivation leads to a more or less pronounced change in the soil structure and hence in the mesofauna living in it. By the time the mesofauna has recovered, the next cultivation is usually already due. Soil compaction also plays a role. In conventional arable farming, a crop rotation including fallow periods or green manuring with clover or lucerne is advisable so as not to strain the mesofauna too much. Organic fertilization with easily degradable litter is clearly preferable to plant residues rich in cellulose and lignin. Incorporating stubble after the harvesting of cereal fields is sensible as long as the soil structure is not disturbed too deeply (ZIMMER 1997).

For the special crop of wine, where tillage would for the reasons shown actually not be necessary, permanent green-cover measures are generally beneficial: erosion is prevented, soil moisture is kept constant, and the resulting mulch is used as green manure. These are all decisive factors that promote the meso- and macrofauna. Only the soil compaction caused by heavy machinery such as tractors and harvesters is not beneficial to the soil (HEISLER 1993, EHRENSBERGER 1993). Low-pressure tyres and reduced traffic are suitable countermeasures.

Contrary to widespread winegrowers' opinion, the plants of a green cover do not really compete with the vines. The advantages of a green cover are not only the promotion of the native flora in its site-appropriate species richness, but also the multiplication of the meso- and macrofauna owing to the greater quantity of easily degradable litter produced and incorporated (GRIEBEL 1995).

Relevance:

10.00%

Publisher:

Abstract:

As a positive response to requests coming from the legal world, often too distant from the scientific one, the aim is to develop a system that is technically solid and legally clear, directed at a better search for the truth. The objective is to create a versatile, easy-to-use tool to be made available to the judicial authority and, where appropriate, to the investigating police, enabling the investigation to proceed very rapidly and with a considerable reduction in justice costs compared with an ordinary court-appointed expert examination (CTU). The project concerns digital forensic analyses of media involved in various types of proceedings for which a CTU or expert report would otherwise be required. The scientific experiment provides for the direct participation of the police and the judicial authority in the forensic analysis, making the content of the seized media available in the form of a virtual machine, so that it can be examined exactly like the original medium. In this way the technical consultant becomes a mere guide for the police and the judicial authority in the digital forensic investigation, accompanying the judge and the parties to a better understanding of the information requested. The key points of the experiment are:
• the repeatability of the operations carried out
• clear guidelines for the chain of custody from the moment the media are taken in charge
• methods of preserving and transmitting the data that guarantee their integrity and confidentiality
• reduced times and costs compared with ordinary CTUs and expert reports
• direct viewing of the contents of the analyzed media by the parties and the judge, restricted to the information relevant to the purposes of justice
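The integrity requirement in the chain of custody is typically enforced with cryptographic hashes: the seized image is hashed when taken in charge, and any working copy (for instance the virtual-machine disk handed to the parties) is re-hashed to prove it is bit-identical to the original. A minimal sketch, with file contents and paths invented for the example:

```python
# Verifying that a working copy is bit-identical to the acquired image.
import hashlib
import os
import tempfile

def sha256_of(path, chunk=1 << 20):
    """SHA-256 of a file, read in chunks so large images fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as fh:
        while block := fh.read(chunk):
            h.update(block)
    return h.hexdigest()

tmp = tempfile.mkdtemp()
original = os.path.join(tmp, "original.img")   # stand-in for the seized image
copy = os.path.join(tmp, "copy.img")           # stand-in for the VM working copy
for path in (original, copy):
    with open(path, "wb") as fh:
        fh.write(b"\x00" * 4096)

assert sha256_of(original) == sha256_of(copy)  # the copy is bit-identical
```

Recording these digests at acquisition time is also what makes the analysis repeatable: anyone can later recompute the hash and confirm that the examined medium has not changed.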

Relevance:

10.00%

Publisher:

Abstract:

The wide diffusion of cheap, small, and portable sensors, integrated in an unprecedentedly large variety of devices, and the availability of almost ubiquitous Internet connectivity make it possible to collect an unprecedented amount of real-time information about the environment we live in. These data streams, if properly and timely analyzed, can be exploited to build new intelligent and pervasive services with the potential to improve people's quality of life in a variety of cross-cutting domains such as entertainment, health care, or energy management. The large heterogeneity of application domains, however, calls for a middleware-level infrastructure that can effectively support their different quality requirements. In this thesis we study the challenges related to the provisioning of differentiated quality of service (QoS) during the processing of data streams produced in pervasive environments. We analyze the trade-offs between guaranteed quality, cost, and scalability in stream distribution and processing by surveying existing state-of-the-art solutions and identifying and exploring their weaknesses. We propose an original model for QoS-centric distributed stream processing in data centers and present Quasit, its prototype implementation, offering a scalable and extensible platform that researchers can use to implement and validate novel QoS-enforcement mechanisms. To support our study, we also explore an original class of weaker quality guarantees that can reduce costs when application semantics do not require strict quality enforcement. We validate the effectiveness of this idea in a practical use-case scenario that investigates partial fault-tolerance policies in stream processing, through a large experimental study on the prototype of our novel LAAR dynamic replication technique.
Our modeling, prototyping, and experimental work demonstrates that, by providing data distribution and processing middleware with application-level knowledge of the different quality requirements associated to different pervasive data flows, it is possible to improve system scalability while reducing costs.
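The idea of exploiting application-level knowledge of per-flow quality requirements can be sketched with a priority dispatcher: tuples from flows with stricter requirements are served first, while arrival order breaks ties. The flow names and priority levels below are invented for the example; Quasit's actual QoS model is richer than a single priority integer.

```python
# Differentiated dispatching of stream tuples by per-flow priority.
import heapq

PRIORITY = {"health-care": 0, "energy": 1, "entertainment": 2}  # lower = stricter QoS

def dispatch(events):
    """Order (flow, payload) events by flow priority, then by arrival order."""
    queue = [(PRIORITY[flow], seq, flow, payload)
             for seq, (flow, payload) in enumerate(events)]
    heapq.heapify(queue)
    return [flow for _, _, flow, _ in
            (heapq.heappop(queue) for _ in range(len(queue)))]

arrivals = [("entertainment", "frame"), ("health-care", "ecg"), ("energy", "meter")]
assert dispatch(arrivals) == ["health-care", "energy", "entertainment"]
```

Under load, such a dispatcher degrades the low-priority flows first, which is precisely the kind of weaker, cheaper guarantee that is acceptable when the application semantics tolerate it.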