798 results for Penalty Clause


Relevance:

10.00%

Publisher:

Abstract:

Verification assesses the quality of quantitative precipitation forecasts (QPFs) against observations and provides indications of systematic model errors. Using the feature-based technique SAL, simulated precipitation distributions are analysed with respect to (S)tructure, (A)mplitude and (L)ocation. For some years, numerical weather prediction models have been run at grid spacings that allow deep convection to be simulated without parameterization, and the question now is whether these models deliver better forecasts. The high-resolution hourly observational dataset used in this work is a combination of radar and station measurements. First, using the German COSMO models as an example, it is shown that the latest-generation models simulate the mean diurnal cycle better, albeit with too weak a maximum that occurs somewhat too late; the older-generation models, in contrast, produce too strong a maximum that occurs much too early. Second, the new model achieves a better simulation of the spatial distribution of precipitation by markedly reducing the windward/lee problem. To quantify these subjective assessments, daily QPFs from four models for Germany over an eight-year period were examined with SAL as well as with classical scores. The higher-resolution models simulate more realistic precipitation distributions (better in S), but the other components show hardly any difference. A further aspect is that the model with the coarsest resolution (ECMWF) is rated clearly best by the RMSE, which illustrates the 'double penalty' problem. Combining the three SAL components shows that, especially in summer, the model with the finest resolution (COSMO-DE) performs best, mainly because of a more realistic structure, so SAL provides helpful information and confirms the subjective assessment.

In 2007 the COPS and MAP D-PHASE projects took place and offered the opportunity to compare the forecast performance of 19 models from three model categories for south-western Germany at accumulation periods of 6 and 12 hours. Noteworthy results are that (i) the smaller the grid spacing of the models, the more realistic the simulated precipitation distributions; (ii) the high-resolution models simulate less precipitation, usually too little; and (iii) the location component is simulated worst by all models. Analysing the forecast performance of these model types for convective situations reveals clear differences. In high-pressure situations the models without convection parameterization are unable to simulate the convection, whereas the models with convection parameterization produce the right amount but structures that are too widespread. For convective events associated with fronts, both model types can simulate the precipitation distribution, with the high-resolution models producing more realistic fields. This weather-regime-based investigation is carried out more systematically using the convective time scale. A climatology compiled for Germany for the first time shows that the frequency of this time scale decays towards larger values following a power law. The SAL results differ dramatically between the two regimes: for small values of the convective time scale they are good, while for large values both structure and amplitude are clearly overestimated.

For precipitation forecasts at very high temporal resolution, the influence of timing errors becomes increasingly important. These errors can be determined by optimizing (minimizing) the L component of SAL within a time window (±3 h) centred on the observation time. It is shown that, at the optimal time shift, the structure and amplitude of the COSMO-DE QPFs improve, which better demonstrates the model's fundamental ability to simulate the precipitation distribution realistically.
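The timing-error idea in the last paragraph, shifting the forecast within a ±3 h window and keeping the shift that minimizes the location component, can be sketched in a few lines. This is only a toy illustration: the real SAL L component combines total and object-wise centre-of-mass displacements, while here a single centre-of-mass distance is used, and all names are hypothetical.

```python
import numpy as np

def location_error(obs, fcst):
    """Toy L-like score: distance between the centres of mass of the
    observed and forecast precipitation fields, normalized by the
    domain diagonal (the real SAL L adds an object-based term)."""
    def com(f):
        idx = np.indices(f.shape).reshape(2, -1)
        w = f.ravel()
        return (idx * w).sum(axis=1) / w.sum()
    d = np.linalg.norm(com(obs) - com(fcst))
    return d / np.linalg.norm(obs.shape)

def best_time_shift(obs_series, fcst_series, window=3):
    """Pick the shift in [-window, +window] hours that minimizes the
    location error against the observation at the window centre."""
    t0 = window  # index of the nominal observation time
    return min(range(-window, window + 1),
               key=lambda s: location_error(obs_series[t0],
                                            fcst_series[t0 + s]))
```

With an hourly series, a returned shift of, say, +2 means the forecast field two hours later matches the observed precipitation location best.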

Relevance:

10.00%

Publisher:

Abstract:

Trans-European networks are one of the vectors of the Union's competitiveness, integration and sustainable development. The thesis highlights the progressive emergence of a coherent, instrumental European infrastructure policy, examining this evolution from three perspectives: normative, institutional and financial. First, from the normative perspective, the thesis shows, on the one hand, the progressive emancipation of the Union's institutions from the influence of the Member States in the exercise of their competences in the field of trans-European networks and, on the other, the development of relations of complementarity and specialty between network policy and other Union policies. Second, from the institutional perspective, it underlines the role of the process of 'organic integration' of national regulators and of the process of 'agencification' in pursuing the objectives of interconnection of, and access to, national networks. Finally, from the financial perspective, the thesis observes the increased importance of EU financial support for the construction of the networks, which has been accompanied by a partial overcoming of the limits that EU law places on the Member States' infrastructure spending policies: on the one hand with respect to competition law, and in particular the prohibition of State aid, thanks to the functional relationship between networks and the provision of services of general economic interest; on the other with respect to budgetary constraints, through an evolutive interpretation of the so-called investment clause of the Stability and Growth Pact. In conclusion, the thesis notes the decisive developments of European network policy, but underlines the role that the Member States are bound to continue to play in its development. On them depends the concrete implementation of this policy, as well as the definitive overcoming, at a future revision of the Treaties, of the intergovernmental legacies that continue to characterize primary law in this area.

Relevance:

10.00%

Publisher:

Abstract:

In recent years, the issue of abuse of law in the field of taxation has become particularly widespread. After a necessary introduction to the problem, this work addresses abuse of law in tax matters through the classical tools of hermeneutics, observing how the instrument of the general anti-abuse clause becomes intertwined with the principle prohibiting abuse of law developed at the European level, itself a concretization of the broader principle of the effectiveness of European Union law. The analysis takes as its models, on the one hand, the German general anti-abuse clause, adopted as early as the years after the First World War, together with its various legislative amendments over the years and, on the other, the European principle prohibiting abuse of law. The joint examination reveals an interpretative short circuit: the European principle expresses the same concepts as the pre-reform German national clause, which, following the Halifax and Cadbury Schweppes judgments, underwent a significant amendment, so that the general clause now needs the European principle in order to be interpreted. The thesis also shows how this circuit is aggravated by tensions within the European institutions themselves: despite the existence of a principle developed through case law, the Member States have been invited to introduce a general anti-abuse clause whose wording refers back to the principle prohibiting abuse of law elaborated by the Court of Justice.

Relevance:

10.00%

Publisher:

Abstract:

Time series are ubiquitous. The acquisition and processing of continuously measured data is found in all areas of the natural sciences, medicine and finance. The enormous growth of recorded data volumes, whether from automated monitoring systems or integrated sensors, demands exceptionally fast algorithms in theory and practice. This thesis therefore deals with the efficient computation of subsequence alignments. Complex algorithms such as anomaly detection, motif queries or the unsupervised extraction of prototypical building blocks in time series make extensive use of these alignments, hence the need for fast implementations. The work is organized into three approaches that address this challenge: four alignment algorithms and their parallelization on CUDA-capable hardware, an algorithm for segmenting data streams, and a unified treatment of Lie-group-valued time series.

The first contribution is a complete CUDA port of the UCR suite, the world-leading implementation of subsequence alignment. It includes a new computation scheme for local alignment scores under the z-normalized Euclidean distance that can be deployed on any parallel hardware with support for fast Fourier transforms. We further give a SIMT-compatible implementation of the UCR suite's lower-bound cascade for efficiently computing local alignment scores under dynamic time warping. Both CUDA implementations compute one to two orders of magnitude faster than established methods.

Second, we investigate two linear-time approximations for the elastic alignment of subsequences. On the one hand, we treat a SIMT-compatible relaxation scheme for greedy DTW and its efficient CUDA parallelization. On the other hand, we introduce a new local distance measure, the Gliding Elastic Match (GEM), which can be computed with the same asymptotic time complexity as greedy DTW but offers a complete relaxation of the penalty matrix. Further improvements include invariance to trends on the measurement axis and to uniform scaling on the time axis. An extension of GEM to multi-shape segmentation is also discussed and evaluated on motion data. Both CUDA parallelizations achieve runtime improvements of up to two orders of magnitude.

The literature usually restricts the treatment of time series to real-valued measurements. The third contribution is a unified method for handling Lie-group-valued time series. Building on this, distance measures on the rotation group SO(3) and on the Euclidean group SE(3) are treated, and memory-efficient representations and group-compatible extensions of elastic measures are discussed.
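The z-normalized Euclidean distance at the heart of the UCR suite can be computed for all subsequences at once from rolling sums and sliding dot products. Below is a minimal CPU sketch; the UCR suite and its CUDA port replace the explicit dot-product loop with an FFT cross-correlation and add the lower-bound cascade for pruning.

```python
import numpy as np

def znorm_distances(ts, query):
    """Z-normalized Euclidean distance between `query` and every
    length-m subsequence of `ts`.  For a z-normalized query q
    (zero mean, unit std) the identity d_i^2 = 2*(m - QT_i / sigma_i)
    holds, where QT_i = dot(T_i, q) and sigma_i is the subsequence std."""
    m = len(query)
    q = (query - query.mean()) / query.std()
    # rolling mean and std of all length-m subsequences in O(n)
    csum = np.cumsum(np.insert(ts, 0, 0.0))
    csum2 = np.cumsum(np.insert(ts * ts, 0, 0.0))
    mu = (csum[m:] - csum[:-m]) / m
    sig = np.sqrt(np.maximum((csum2[m:] - csum2[:-m]) / m - mu * mu, 1e-12))
    # sliding dot products; the UCR suite computes these via FFT
    qt = np.array([np.dot(ts[i:i + m], q) for i in range(len(ts) - m + 1)])
    return np.sqrt(np.maximum(2.0 * (m - qt / sig), 0.0))
```

A subsequence that is any affine transform of the query (scaled and shifted) has distance zero, which is exactly the invariance that motivates z-normalization.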

Relevance:

10.00%

Publisher:

Abstract:

In this thesis I present a new coarse-grained model suitable for investigating the phase behavior of rod-coil block copolymers on mesoscopic length scales. In this model the rods are represented by hard spherocylinders, whereas the coil block consists of interconnected beads. The interactions between the constituents are based on local densities, which facilitates an efficient Monte Carlo sampling of the phase space. I verify the applicability of the model and the simulation approach by means of several examples: I treat pure rod systems and mixtures of rod and coil polymers, then append coils to the rods and investigate the role of the different model parameters, and I compare different implementations of the model. I show that the rod-coil block copolymers in our model exhibit typical micro-phase-separated configurations as well as extraordinary phases, such as the wavy lamellar state, percolating structures and clusters. Additionally, I demonstrate the metastability of the observed zigzag phase in our model. A central point of this thesis is the examination of the phase behavior of the rod-coil block copolymers as a function of chain length and of the interaction strength between rods and coil. The observations of these studies are summarized in a phase diagram for rod-coil block copolymers. Furthermore, I validate a stabilization of the smectic phase with increasing coil fraction.

In the second part of this work I present a side project in which I derive a model permitting the simulation of tetrapods with and without grafted semiconducting block copolymers. The effect of these polymers is added implicitly through effective interactions between the tetrapods: the depletion interaction is described approximately within the Asakura-Oosawa model, while the free-energy penalty for brush compression is calculated within the Alexander-de Gennes model. Recent experiments with CdSe tetrapods show that grafted tetrapods are clearly much better dispersed in the polymer matrix than bare tetrapods. My simulations confirm that bare tetrapods tend to aggregate in the matrix of excess polymers, while clustering is significantly reduced after grafting polymer chains to the tetrapods. Finally, I propose a possible extension enabling the simulation of a system with fluctuating volume and demonstrate its basic functionality. This study originated in a cooperation with an experimental group whose goal is to analyze the morphology of these systems in order to find the ideal morphology for hybrid solar cells.
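The implicit-polymer idea in the second part can be illustrated with the depletion term alone. Here is a sketch of the standard Asakura-Oosawa pair potential between two hard spheres, a simplified stand-in for the tetrapod geometry; the Alexander-de Gennes brush term would be added on top, and all parameter values below are made up.

```python
import numpy as np

def asakura_oosawa(r, sigma_c, sigma_p, phi_p, kT=1.0):
    """Asakura-Oosawa depletion potential between two hard colloids of
    diameter sigma_c in a bath of ideal depletants of diameter sigma_p
    at packing fraction phi_p.  Attraction acts for sigma_c <= r < d
    with d = sigma_c + sigma_p; U = -Pi_p * V_overlap of the shells."""
    d = sigma_c + sigma_p
    r = np.asarray(r, dtype=float)
    u = np.zeros_like(r)
    overlap = (r >= sigma_c) & (r < d)
    x = r[overlap] / d
    # normalized lens volume of the overlapping depletion shells
    u[overlap] = -phi_p * kT * (1.0 + sigma_c / sigma_p) ** 3 \
                 * (1.0 - 1.5 * x + 0.5 * x ** 3)
    u[r < sigma_c] = np.inf  # hard-core exclusion
    return u
```

The attraction is deepest at contact (r = sigma_c), vanishes continuously at r = d, and grows with depletant packing fraction, which is the mechanism behind the aggregation of bare tetrapods in excess polymer.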

Relevance:

10.00%

Publisher:

Abstract:

Many applications rely on relaxometry and nuclear magnetic resonance (NMR) techniques. Such applications give rise to problems of inverting the discrete Laplace transform, a notoriously ill-posed problem. UPEN (Uniform Penalty) is a numerical regularization method suited to solving problems of this kind. UPEN reformulates the inversion of the Laplace transform as a constrained minimization problem in which the objective function contains a data-fitting term and a local penalty component that varies with the solution itself. Modern NMR spectroscopy studies multidimensional correlations of the longitudinal and transverse relaxation parameters. To study the problems arising from the analysis of multi-component samples, it has become necessary to extend the algorithms implementing the one-dimensional inverse Laplace transform to the two-dimensional case. This thesis proposes a possible extension of the UPEN algorithm from the one-dimensional to the two-dimensional case and provides a numerical analysis of this extension on simulated and real data.
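The kind of constrained minimization UPEN solves can be illustrated in one dimension with a fixed (uniform) penalty weight; the actual UPEN algorithm goes further and iteratively adapts the local penalty to the solution's own curvature. A sketch assuming NumPy/SciPy, with simulated two-component relaxation data (all grid sizes and weights are arbitrary choices):

```python
import numpy as np
from scipy.optimize import nnls

# Simulated decay s(t) = sum_j f_j * exp(-t / T_j) + noise
t = np.linspace(0.01, 3.0, 100)           # acquisition times
T = np.logspace(-2, 1, 60)                # relaxation-time grid
K = np.exp(-t[:, None] / T[None, :])      # discrete Laplace kernel
f_true = np.zeros(60)
f_true[20], f_true[45] = 1.0, 0.5         # two relaxation components
rng = np.random.default_rng(1)
s = K @ f_true + 1e-3 * rng.normal(size=t.size)

# Regularized, nonnegative fit: minimize ||K f - s||^2 + lam ||L f||^2,
# f >= 0, with a second-difference penalty L and a single uniform lam
# (UPEN would let lam vary locally with the solution).
lam = 1e-2
L = np.diff(np.eye(60), n=2, axis=0)
A = np.vstack([K, np.sqrt(lam) * L])
b = np.concatenate([s, np.zeros(L.shape[0])])
f_est, _ = nnls(A, b)
```

Stacking the penalty rows under the kernel turns the penalized problem into a plain nonnegative least-squares problem, which is a common way to prototype this class of inversions.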

Relevance:

10.00%

Publisher:

Abstract:

The Institute International Hull Clauses also contain a Sue and Labour clause, under which expenses incurred in trying to minimize the loss are covered by the insurer. Even if a Sue and Labour clause were not present in an insurance policy, however, the insurer might still be obliged to reimburse the insured for measures taken to avert the loss, as if these conferred a benefit upon it. To better understand these arguments, it is worth recalling the Australian case Emperor Goldmining Co. Ltd. v. Switzerland General Ins. Co. Ltd., which in essence held that the absence of a Sue and Labour clause from the policy does not mean that the expenses fall on the claimant.

Relevance:

10.00%

Publisher:

Abstract:

Clinical-forensic examination of strangulation victims is an increasing part of the routine work of many forensic pathology institutes. The cases examined between 2004 and 2008 at the Institute of Legal Medicine of the Hanover Medical School were retrospectively analysed. In total, the study material comprised 218 victims (175 females and 43 males). In 80.7 % of cases, the clinical-forensic examination was performed within 24 hours of the incident. In the overwhelming majority of cases, the alleged perpetrator was no stranger. 128 victims (58.7 %) had strangulation marks, 32 (14.7 %) ligature marks and 65 (29.8 %) nail marks. Four victims showed injuries of the laryngeal and pharyngeal structures (reddening, hematomas, swelling and, in one case, a fracture of the cricoid cartilage on both sides). Extensive petechiae were predominantly seen in the conjunctivae, the buccal mucosa and the skin of the face in cases where the victims had suffered a loss of consciousness. 87 cases (39.9 %) were classified as potentially life-threatening and 30 cases (13.8 %) as acute life-threatening events; this classification is of legal relevance for the penalty. In addition, 60 victims experienced sexual violence. These results suggest that early clinical-forensic examination is crucial for documenting forensic evidence in support of police investigations and may deliver significant details relevant in court.

Relevance:

10.00%

Publisher:

Abstract:

Decision trees have been proposed as a basis for modifying table-based injection to reduce transient particulate spikes during the turbocharger lag period. It has been shown that decision trees can detect particulate spikes in real time. In well-calibrated, electronically controlled diesel engines these spikes are narrow and are encompassed by a wider NOx spike. Decision trees have been shown to pinpoint the exact location of measured opacity spikes in real time, thus enabling targeted PM reduction with a near-zero NOx penalty. A calibrated dimensional model has been used to demonstrate the possible reduction of particulate matter with targeted injection pressure pulses. A post-injection strategy optimized for near-stoichiometric combustion has been shown to provide additional benefits. Empirical models have been used to calculate emission tradeoffs over the entire FTP cycle. An empirical model-based transient calibration has been used to demonstrate that such targeted transient modifiers are more beneficial at lower engine-out NOx levels.
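To give a feel for how a trained decision tree might act as a real-time transient modifier, here is a hand-coded stand-in with entirely hypothetical features and thresholds; a real calibration would learn the tree from measured opacity data rather than hard-code it.

```python
def pm_spike_predicted(fuel_rate_deriv, boost_pressure, egr_fraction):
    """Hypothetical hand-coded decision tree: during turbocharger lag,
    fuelling ramps up fast while boost is still low, so an opacity
    (PM) spike is likely and the injection table can be modified.
    All feature names, units and thresholds are illustrative only."""
    if fuel_rate_deriv > 5.0:           # rapid pedal tip-in (mg/stroke/s)
        if boost_pressure < 1.4:        # bar: turbo has not caught up yet
            return True                 # likely PM spike -> apply modifier
        return egr_fraction > 0.25      # high EGR with moderate boost
    return False                        # steady-state: leave table alone
```

Evaluating such a tree is a handful of comparisons per engine cycle, which is what makes the approach feasible in real time on an engine controller.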

Relevance:

10.00%

Publisher:

Abstract:

Do you know what choices you would make if faced with an ethical dilemma? This fact-based case includes situations and issues that a real citizen considered when faced with the knowledge that his employer may have been overbilling the state of North Carolina for health care. Professionals, especially those in accounting and finance positions, are likely to face serious dilemmas in the course of their careers. These situations may require them to choose between honoring a confidentiality clause in an employment contract and acting according to ethical and professional values. This case provides facts gathered from an actual case in which an individual faced this particular challenge. By working through the case, students should develop an appreciation of the pressures and personal ethical challenges they are likely to face in the workplace. By engaging in discussion and role play, students will be more likely to recognize these issues when they occur, and will have already developed critical thinking skills to help them develop a plan of action.

Relevance:

10.00%

Publisher:

Abstract:

This project looked at the nature, contents, methods, means, and legal and political effects of the influence that constitutional courts exercise upon the legislative and executive powers in the newly established democracies of Central and Eastern Europe. The basic hypothesis was that these courts work to limit political power within the framework of the principal constitutional values and that they force the legislature and executive to exercise their powers and duties in strict accordance with the constitution. Following a study of the documentary sources, including primarily the relevant constitutional and statutory provisions and the decisions of constitutional courts, Mr. Cvetkovski prepared a questionnaire on various aspects of the topics researched and sent it to the respective constitutional courts. A series of direct interviews with court officials in six of the ten countries then served to clarify a large number of questions, relating to differences in procedures and the like, that arose from the questionnaires. As a final stage, the findings were compared with those described in recent publications on constitutional control in general and in Central and Eastern Europe in particular.

The study began by considering the constitutional and political environment of the constitutional courts' activities in controlling legislative and executive powers, which in all the countries studied are based on the principles of the rule of law and the separation of powers. All courts are separate bodies with special status in terms of constitutional law and are independent of other political and judicial institutions. The range of matters within their jurisdiction is set by the constitution of the country in question, but in all cases jurisdiction can be exercised only within a framework of procedural rules. This gives considerable significance to the question of who sets these rules, and different countries have dealt with it in different ways. In some there is a special constitutional law with the same legal force as the constitution itself (Croatia); the majority of countries allow for regulation by an ordinary law; Macedonia gives the court the autonomy to create and change its own rules of procedure; while in Hungary the parliament fixes the rules of procedure at the suggestion of the constitutional court. The question of the appointment of constitutional judges was also considered, along with the mechanisms for ensuring their impartiality and immunity.

In the area of normative control, considerable differences were found between the countries. In some cases the courts' jurisdiction is limited to the normative acts of the respective parliaments, and there is generally no provision for challenging unconstitutional omissions by the legislature and the executive. There are, however, some situations in which the courts may indirectly evaluate the constitutionality of legislative omissions: when the constitution contains a time limit for enacting legislation, when the parliament has made an omission in drafting a law which violates constitutional provisions, or when a law grants favours to certain groups while excluding others, thereby violating the equal protection clause of the constitution. The control of the constitutionality of normative acts can be either preventive or repressive, depending on whether it is implemented before or after the promulgation of the law or other enactment being challenged. In most countries in the region the constitutional courts provide only repressive control, although in Hungary and Poland the courts are competent to perform both preventive and repressive norm control, while in Romania the court's jurisdiction is limited to preventive norm control. Most countries are wary of vesting constitutional courts with preventive norm control because of the danger of their becoming too involved in day-to-day political debate, but Mr. Cvetkovski points out certain advantages of such control: if combined with a short time limit it can provide early clarification of a constitutional issue; it avoids the problems arising if a law that has been in force for some years is declared unconstitutional; and it may help preserve the prestige of the legislation. Its disadvantages include the difficulty of ascertaining the actual and potential consequences of a norm without the empirical experience of the administration and enforcement of the law, the desirability of a certain distance from the day-to-day arguments surrounding the political process of legislation, the possible effects of changing social and economic conditions, and the danger of placing obstacles in the way of rapid reactions to acute situations.

Repressive norm control can be either abstract or concrete. The former is initiated by the supreme state organs in order to protect the abstract constitutional order; the latter is initiated by ordinary courts, administrative authorities or individuals. Constitutional courts cannot directly oblige the legislature and executive to pass a new law; this remains a matter of legislative and executive political responsibility. In the case of Poland, the parliament even has the power to dismiss a constitutional court decision by a special majority of votes, which means that the last word lies with the legislature. As the current constitutions of the Central and Eastern European countries are newly adopted and differ significantly from their predecessors, the courts' interpretative functions should ensure a degree of unification in the application of the constitution. Some countries (Bulgaria, Hungary, Poland, Slovakia and Russia) provide for the constitutional courts' interpretations of the constitution to be binding. While their decisions inevitably have an influence on the actions of public bodies, they do not set criteria for political behaviour, which depends rather on the overall political culture and traditions of the society.

All the constitutions except that of Belarus provide for the courts to have jurisdiction over conflicts arising from the distribution of responsibilities between different organs and levels in the country, as well as over impeachment procedures against the head of state and the determination of the constitutionality of political parties (except in Belarus, Hungary, Russia and Slovakia). All the constitutions studied guarantee individual rights and freedoms, and most courts have jurisdiction over complaints of violations of the rights guaranteed by the constitution. All courts also have some jurisdiction over international agreements and treaties, either directly (Belarus, Bulgaria and Hungary), before the treaty is ratified, or indirectly (Croatia, Czech Republic, Macedonia, Romania, Russia and Yugoslavia). In each country the question of who may initiate norm-control proceedings is of central importance and is usually regulated by the constitution itself. There are three main possibilities: statutory organs, ordinary courts and private individuals; the limitations on each of these are discussed in the report. Most courts are limited in their right to institute ex officio a full-scale review of a point of law, and such rights as they do have are rarely used. In most countries the courts' decisions do not have binding force of their own but must be approved by parliament or impose on parliament the obligation to bring the relevant law into conformity within a certain period. As a result, the courts' position is generally weaker than in other European countries, with parliament remaining the supreme body.

In the case of preventive norm control, a finding of unconstitutionality may act to suspend the law and/or to refer it back to the legislature, where in countries such as Romania it may even be overturned by a two-thirds majority. In repressive norm control, a finding of unconstitutionality generally serves to take the relevant law out of legal force from the day of publication of the decision or from another date fixed by the court. If the law is annulled retrospectively, this may or may not bring decisions of criminal courts under review, depending on the provisions of the relevant constitution. In cases relating to conflicts of competencies, the courts' decisions tend to be declaratory and so have a binding effect inter partes. In the case of a review of an individual act, decisions generally take effect primarily inter partes; but if the individual act was based on an unconstitutional generally binding normative act of the legislature or executive, the finding has a quasi-legal effect, as it automatically initiates special proceedings in which the law or other regulation is annulled or abrogated with effect erga omnes. This wards off further application of the law and thus further violations of individual constitutional rights, but it also discourages further constitutional complaints against the same law, since the success of one individual's complaint extends to everyone else whose rights have equally been or might have been violated by the respective law. As the body whose act is repealed is obliged to adopt another act, and in doing so is bound by the legal position of the constitutional court on the violation of the constitutionally guaranteed freedoms and rights of the complainant, in this situation the decision of the constitutional court has the force of a precedent.

Relevance:

10.00%

Publisher:

Abstract:

Numerous time series studies have provided strong evidence of an association between increased levels of ambient air pollution and increased levels of hospital admissions, typically at 0, 1, or 2 days after an air pollution episode. An important research aim is to extend existing statistical models so that a more detailed understanding of the time course of hospitalization after exposure to air pollution can be obtained. Information about this time course, combined with prior knowledge about biological mechanisms, could provide the basis for hypotheses concerning the mechanism by which air pollution causes disease. Previous studies have identified two important methodological questions: (1) How can we estimate the shape of the distributed lag between increased air pollution exposure and increased mortality or morbidity? and (2) How should we estimate the cumulative population health risk from short-term exposure to air pollution? Distributed lag models are appropriate tools for estimating air pollution health effects that may be spread over several days. However, estimation for distributed lag models in air pollution and health applications is hampered by the substantial noise in the data and the inherently weak signal that is the target of investigation. We introduce a hierarchical Bayesian distributed lag model that incorporates prior information about the time course of pollution effects and combines information across multiple locations. The model has a connection to penalized spline smoothing through a special type of penalty matrix. We apply the model to estimate the distributed lag between exposure to particulate matter air pollution and hospitalization for cardiovascular and respiratory disease, using data from a large United States air pollution and hospitalization database of Medicare enrollees in 94 counties covering the years 1999-2002.
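A frequentist toy version of a distributed lag model shows the role of the penalty matrix: a second-difference penalty on the lag coefficients, the same device used in penalized spline smoothing. The hierarchical Bayesian model in the paper replaces this fixed penalty with priors and pools information across locations; all numbers below are simulated, not from the Medicare database.

```python
import numpy as np

rng = np.random.default_rng(2)
n, max_lag = 500, 14
pm = rng.gamma(2.0, 10.0, size=n)                        # daily PM levels
true_lag = 0.3 * np.exp(-np.arange(max_lag + 1) / 2.0)   # smooth decaying lag curve

# Design matrix of lagged exposures; drop the first max_lag wrapped rows.
X = np.column_stack([np.roll(pm, l) for l in range(max_lag + 1)])[max_lag:]
y = X @ true_lag + rng.normal(0.0, 2.0, size=n - max_lag)

# Penalized least squares: (X'X + lam * D'D) beta = X'y, where D is the
# second-difference operator, i.e. a penalty matrix that shrinks the lag
# curve towards smoothness rather than towards zero.
D = np.diff(np.eye(max_lag + 1), n=2, axis=0)
lam = 50.0
beta = np.linalg.solve(X.T @ X + lam * D.T @ D, X.T @ y)
```

The smoothness penalty is what stabilizes the "inherently weak signal": unpenalized lag-by-lag estimates would be far noisier, while the penalty borrows strength across adjacent lags.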

Relevance:

10.00%

Publisher:

Abstract:

Virtualization has become a common abstraction layer in modern data centers. By multiplexing hardware resources into multiple virtual machines (VMs) and thus enabling several operating systems to run on the same physical platform simultaneously, it can effectively reduce power consumption and building size or improve security by isolating VMs. In a virtualized system, memory resource management plays a critical role in achieving high resource utilization and performance. Insufficient memory allocation to a VM will degrade its performance dramatically. On the contrary, over-allocation causes waste of memory resources. Meanwhile, a VM’s memory demand may vary significantly. As a result, effective memory resource management calls for a dynamic memory balancer, which, ideally, can adjust memory allocation in a timely manner for each VM based on their current memory demand and thus achieve the best memory utilization and the optimal overall performance. In order to estimate the memory demand of each VM and to arbitrate possible memory resource contention, a widely proposed approach is to construct an LRU-based miss ratio curve (MRC), which provides not only the current working set size (WSS) but also the correlation between performance and the target memory allocation size. Unfortunately, the cost of constructing an MRC is nontrivial. In this dissertation, we first present a low overhead LRU-based memory demand tracking scheme, which includes three orthogonal optimizations: AVL-based LRU organization, dynamic hot set sizing and intermittent memory tracking. Our evaluation results show that, for the whole SPEC CPU 2006 benchmark suite, after applying the three optimizing techniques, the mean overhead of MRC construction is lowered from 173% to only 2%. Based on current WSS, we then predict its trend in the near future and take different strategies for different prediction results. 
When there is a sufficient amount of physical memory on the host, the balancer locally redistributes memory among the VMs. Once local memory is insufficient and the memory pressure is predicted to persist for a sufficiently long time, a relatively expensive solution, VM live migration, is used to move one or more VMs from the hot host to other host(s). Finally, for transient memory pressure, a remote cache is used to alleviate the temporary performance penalty. Our experimental results show that this design achieves a 49% center-wide speedup.
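The MRC idea above can be sketched with the classic stack-distance algorithm: for each memory access, the hit is credited at its LRU stack distance, and the miss ratio for every cache size falls out of the distance histogram. This linear-scan version is illustrative only; the dissertation's AVL-based LRU organization exists precisely to avoid the O(n) scan done here.

```python
from collections import OrderedDict

def miss_ratio_curve(trace, max_size):
    """LRU miss-ratio curve via stack (reuse) distances:
    mrc[c] = fraction of accesses that miss with an LRU cache
    of c pages."""
    stack = OrderedDict()        # most-recently-used page is last
    hist = [0] * (max_size + 1)  # hist[d]: hits at stack distance d
    for page in trace:
        if page in stack:
            keys = list(stack)
            d = len(keys) - keys.index(page)  # O(n) scan; an AVL tree
            if d <= max_size:                 # keeps this O(log n)
                hist[d] += 1
            del stack[page]
        stack[page] = None       # (re)insert as most recently used
    n = len(trace)
    mrc, hits = [], 0
    for c in range(max_size + 1):
        hits += hist[c]
        mrc.append(1.0 - hits / n)
    return mrc
```

Reading the curve at the current allocation gives the predicted miss ratio, and the knee of the curve approximates the working set size.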

Resumo:

An extrusion die is used to continuously produce parts with a constant cross section, such as sheets, pipes, tire components, and more complex shapes such as window seals. The die is fed by a screw extruder when polymers are used. The extruder melts, mixes, and pressurizes the material by the rotation of either a single or double screw. The polymer can then be continuously forced through the die, producing a long part in the shape of the die outlet. The extruded section is then cut to the desired length. Generally, the primary target of a well-designed die is to produce a uniform outlet velocity without excessively raising the pressure required to extrude the polymer through the die. Other properties such as temperature uniformity and residence time are also important but are not directly considered in this work. Designing dies for optimal outlet velocity variation using simple analytical equations is feasible for basic die geometries or simple channels. Due to the complexity of die geometry and of polymer material properties, designing complex dies by analytical methods is difficult; for complex dies, iterative methods must be used, and an automated iterative method is desired. To automate the design and optimization of an extrusion die, two issues must be dealt with. The first is how to generate a new mesh for each iteration. In this work, this is approached by modifying a Parasolid file that describes a CAD part; this file is then used in commercial meshing software. Skewing the initial mesh to produce a new geometry was also employed as a second option. The second issue is an optimization problem in the presence of noise stemming from variations in the mesh and cumulative truncation errors. In this work, a simplex method and a modified trust region method were employed for automated optimization of die geometries. For the trust region method, a discrete derivative and a BFGS Hessian approximation were used.
To deal with the noise in the objective function, the trust region method was modified to automatically adjust the discrete derivative step size and the trust region radius based on changes in noise and function contour. Generally, uniformity of the velocity at the exit of the extrusion die can be improved by increasing resistance across the die, but this is limited by the pressure capabilities of the extruder. In the optimization, a penalty factor that increases exponentially beyond the pressure limit is applied. This penalty can be applied in two different ways: the first only to designs that exceed the pressure limit, the second to designs both above and below the pressure limit. Both of these methods were tested and compared in this work.
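A minimal sketch of the two penalty variants, assuming an exponential penalty on the normalized pressure excess; the exact functional form and weighting used in the work are not specified here, so treat this as one plausible reading of "a penalty factor that increases exponentially from the pressure limit".

```python
import math

def penalized_objective(vel_variation, pressure, p_limit,
                        weight=1.0, one_sided=True):
    """Velocity non-uniformity plus a pressure penalty that grows
    exponentially with the normalized excess over the extruder limit.
    one_sided=True  -> penalize only designs over the limit (method 1)
    one_sided=False -> apply the same curve above and below the limit
                       (method 2), crediting designs with headroom"""
    excess = pressure / p_limit - 1.0  # assumed normalization
    if one_sided:
        excess = max(excess, 0.0)
    return vel_variation + weight * (math.exp(excess) - 1.0)
```

The one-sided form leaves feasible designs untouched, while the two-sided form shapes the objective everywhere, which can help a noisy derivative-based optimizer steer away from the constraint boundary.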

Resumo:

Algae are considered a promising future source of biofuels. However, the environmental impact of algae-based fuel shows high variability across previous LCA studies due to a lack of accurate data from researchers and industry. The National Alliance for Advanced Biofuels and Bioproducts (NAABB) project was designed to produce and evaluate new technologies that can be implemented by the algal biofuel industry and to establish overall process sustainability. The MTU research group within NAABB worked on the environmental sustainability part of the consortium with UOP-Honeywell and with the University of Arizona (Dr. Paul Blowers). Several life cycle analysis (LCA) models were developed within the GREET Model and SimaPro 7.3 software to quantitatively assess the environmental viability and sustainability of algal fuel processes. The baseline GREET Harmonized algae life cycle was expanded and replicated in SimaPro, important differences in emission factors between the GREET/eGRID and SimaPro/Ecoinvent databases were compared, and adjustments were made to the SimaPro analyses. The results indicated that in most cases SimaPro assigns a higher emission penalty to inputs of electricity, chemicals, and other materials in the algae biofuel life cycle. A system-wide model of the algae life cycle was built, starting with preliminary data from the literature, progressing to detailed analyses based on inputs from all NAABB research areas, and finally investigating several important scenarios as variations on the baseline. Scenarios include conversion to jet fuel instead of biodiesel or renewable diesel, the impact of infrastructure for algae cultivation, co-product allocation methodology, and different uses of lipid-extracted algae (LEA). The infrastructure impact of algae cultivation is minimal compared to the overall life cycle.
However, the scenarios investigating LEA use as animal feed, instead of internal recycling for energy use and nutrient recovery, reflect the high potential variability in LCA results: calculated life cycle GHG values for biofuel production scenarios where LEA is used as animal feed ranged from a 55% reduction to a 127% increase relative to the GREET baseline scenario, depending on the choice of feed meal. Different allocation methods also affect LCA results significantly. Four novel harvesting technologies and two extraction technologies from the NAABB internal report were analyzed using SimaPro. The results indicated that the combination of acoustic harvesting and acoustic extraction is the most promising of all combinations for recovering algae oil. These scenario evaluations provide important insights for planning the future of an algae-based biofuel industry.
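The sensitivity to allocation methodology can be illustrated with a toy calculation that splits a shared upstream GHG burden between fuel and LEA co-product by mass or by energy content; the product names and numbers below are hypothetical, not NAABB data.

```python
def allocate_emissions(total_ghg, products, basis):
    """Split a shared life-cycle GHG burden among co-products in
    proportion to the chosen allocation basis ('mass' or 'energy').
    products maps name -> {'mass': kg, 'energy': MJ}."""
    weights = {name: p[basis] for name, p in products.items()}
    total = sum(weights.values())
    return {name: total_ghg * w / total for name, w in weights.items()}
```

When the fuel is the more energy-dense co-product, energy allocation assigns it a larger share of the burden than mass allocation does, which is one mechanism behind the wide ranges reported above.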