731 results for Fair Work Regulations 2009
Abstract:
The present work aims to provide a comprehensive and comparative study of the different legal and regulatory problems involved in international securitization transactions. First, an introduction to securitization is provided, covering the basic elements of the transaction, followed by its different varieties, including dynamic securitization and synthetic securitization structures. Together with this introduction to the intricacies of the structure, an insight into the influence of securitization on the financial and economic crisis of 2007-2009 is also provided, as well as an overview of the process of regulatory competition and cooperation that constitutes the framework for the international aspects of securitization. The next Chapter focuses on the aspects that constitute the foundations of structured finance: the inception of the vehicle and the transfer of the risks associated with the securitized assets, with particular emphasis on the validity of those elements and on how a securitization transaction could be threatened at its root. In this sense, special importance is given to the validity of the trust as an instrument of finance, to the assignment of future receivables or receivables in block, and to the importance of formalities for the validity of corporations, trusts, assignments, etc., and to the interaction of such formalities contained in general corporate, trust and assignment law with those contemplated under specific securitization regulations. The next Chapter (III) then focuses on creditor protection aspects. We provide some insights on the debate on the capital structure of the firm and its inadequacy for assessing the financial soundness problems inherent in securitization. We then proceed to analyze the importance of rules on creditor protection in the context of securitization. The corollary lies in the rules that apply in the event of insolvency. Here, we distinguish the cases where a party involved in the transaction goes bankrupt from those where the transaction itself collapses. Finally, we focus on the scenario where a substance-over-form analysis may compromise some of the elements of the structure (notably the limited liability of the sponsor and/or the transfer of assets) by means of veil-piercing, substantive consolidation, or recharacterization theories. Once these elements have been covered, the following Chapters focus on the regulatory aspects involved in the transaction. Chapter IV deals with “market” regulations, i.e. those concerned with information disclosure and other rules (appointment of the indenture trustee, and elaboration of a rating by a rating agency) concerning the offering of asset-backed securities to the public. Chapter V, on the other hand, focuses on “prudential” regulation of the entity entrusted with securitizing assets (the so-called Special Purpose Vehicle) and of other entities involved in the process. Regarding the SPV, reference is made to licensing requirements, restrictions on activities and governance structures to prevent abuses. Regarding the sponsor of the transaction, the focus is on provisions on sound origination practices and on the servicing function. Finally, we study accounting and banking regulations, including the Basel I and Basel II Frameworks, which determine the consolidation of the SPV and the de-recognition of the securitized assets from the originating company’s balance sheet, as well as the subsequent treatment of those assets, in particular by banks. Chapters VI-IX are concerned with liability matters.
Chapter VI is an introduction to the different sources of liability. Chapter VII focuses on the liability of the SPV and its management for the information supplied to investors, the management of the asset pool, and the breach of loyalty (or fiduciary) duties. Chapter VIII addresses the liability of the originator arising from such information and statements, but also from inadequate and reckless origination or servicing practices. Chapter IX, finally, focuses on the third parties entrusted with ensuring the soundness of the transaction towards the market, the so-called gatekeepers. In this respect, we place special emphasis on the liability of indenture trustees, underwriters and rating agencies. Chapters X and XI focus on the international aspects of securitization. Chapter X contains a conflict-of-laws analysis of the different aspects of structured finance. In this respect, a study is made of the laws applicable to the vehicle, to the transfer of risks (either by assignment or by means of derivatives contracts), and to liability issues; a study is also made of the competent jurisdiction (and applicable law) in bankruptcy cases, as well as in cases where a substance-over-form analysis is performed. Special attention is also devoted to the role of financial and securities regulations, to their territorial limits, and to the extraterritoriality problems involved. Chapter XI supplements the prior Chapter, for it analyzes the limits imposed on the States’ exercise of regulatory power by the personal and “market” freedoms included in the US Constitution or the EU Treaties. Reference is also made to the (still insufficient) rules of the WTO Framework and their significance for the States’ recognition and regulation of securitization transactions.
Abstract:
The present dissertation focuses on burnout and work engagement among teachers, with a special focus on the Job Demands-Resources (JD-R) Model. Chapter 1 focuses on teacher burnout. It aims to investigate the role of efficacy beliefs using negatively worded inefficacy items instead of positive ones, and to establish whether depersonalization and cynicism can be considered two different dimensions of the teacher burnout syndrome. Chapter 2 investigates the factorial validity of the instruments used to measure work engagement (i.e. the Utrecht Work Engagement Scale, UWES-17 and UWES-9). Moreover, because the current study is partly longitudinal in nature, the stability of engagement across time can also be investigated. Finally, based on cluster analyses, two groups that differ in levels of engagement are compared with respect to their job and personal resources (i.e. possibilities for personal development, work-life balance, and self-efficacy), positive organizational attitudes and behaviours (i.e. job satisfaction and organizational citizenship behaviour), and perceived health. Chapter 3 tests the JD-R model longitudinally, also integrating the role of personal resources (i.e. self-efficacy). This chapter asks which job demands and which job and personal resources contribute most to discriminating burned-out teachers from non-burned-out teachers, as well as engaged teachers from non-engaged teachers. Chapter 4 uses a diary study to extend knowledge about the dynamic nature of the JD-R model by considering between- and within-person variations with regard to both the motivational and the health impairment processes.
Abstract:
The present thesis investigates the issue of work-family conflict and facilitation in a healthcare context, using the DISC Model (De Jonge and Dormann, 2003, 2006). The general aim has been articulated in two empirical studies reported in the chapters of this dissertation. Chapter 1 reports the psychometric properties of the Demand-Induced Strain Compensation Questionnaire. Although the DISC Model has received a fair amount of attention in the literature, both for its theoretical principles and for the instrument developed to operationalize them (DISQ; De Jonge, Dormann, Van Vegchel, Von Nordheim, Dollard, Cotton and Van den Tooren, 2007), there are no studies based solely on a psychometric investigation of the instrument. In addition, no previous studies have used the DISC as a model or measurement instrument in an Italian context. Thus, the first chapter of the present dissertation is based on a psychometric investigation of the DISQ. Chapter 2 reports a longitudinal study. Its purpose was to examine, using the DISC Model, the relationship between emotional job characteristics, the work-family interface and emotional exhaustion in a health care population. We started by testing the Triple Match Principle of the DISC Model using solely the emotional dimension of the stress-strain process (i.e. emotional demands, emotional resources and emotional exhaustion). We then investigated the mediating role played by work-family conflict and work-family facilitation between emotional job characteristics and emotional exhaustion. Finally, we compared the mediation model between workers facing chronic-illness home demands and workers not facing such demands. A general conclusion integrates and discusses the main findings of the studies reported in this dissertation.
Abstract:
Proton-nucleus elastic scattering at intermediate energies is a well-established method for investigating the nuclear matter distribution in stable nuclei, and it was recently also applied to radioactive nuclei using the method of inverse kinematics. In the current experiment, the differential cross sections for proton elastic scattering on the isotopes $^{7,9,10,11,12,14}$Be and $^8$B were measured. The experiment was performed using the fragment separator at GSI, Darmstadt, to produce the radioactive beams. The main part of the experimental setup was the time projection ionization chamber IKAR, which was used simultaneously as hydrogen target and as detector for the recoil protons. Auxiliary detectors for projectile tracking and isotope identification were also installed. From the experiment, the absolute differential cross sections d$\sigma$/d$t$ as a function of the four-momentum transfer $t$ were obtained. In this work the differential cross sections for elastic p-$^{12}$Be, p-$^{14}$Be and p-$^{8}$B scattering at low $t$ ($t \leq 0.05$~(GeV/c)$^2$) are presented. The measured cross sections were analyzed within the Glauber multiple-scattering theory using different density parameterizations, and the nuclear matter density distributions and radii of the investigated isotopes were determined. The analysis of the differential cross section for the isotope $^{14}$Be shows that a good description of the experimental data is obtained when density distributions consisting of separate core and halo components are used. The determined rms matter radius is $3.11 \pm 0.04 \pm 0.13$~fm. In the case of the $^{12}$Be nucleus the results also showed an extended matter distribution. For this nucleus a matter radius of $2.82 \pm 0.03 \pm 0.12$~fm was determined. An interesting result is that the free $^{12}$Be nucleus behaves differently from the core of $^{14}$Be and is much more extended than it. The data were also compared with theoretical densities calculated within the FMD and few-body models. In the case of $^{14}$Be the calculated cross sections describe the experimental data well, while in the case of $^{12}$Be there are discrepancies in the region of high momentum transfer. Preliminary experimental results for the isotope $^8$B are also presented. An extended matter distribution was obtained (though much more compact compared to the neutron halos). A proton halo structure was observed for the first time with the proton elastic scattering method. The deduced matter radius is $2.60 \pm 0.02 \pm 0.26$~fm. The data were compared with microscopic calculations in the framework of the FMD model, and reasonable agreement was observed. The results obtained in the present analysis are in most cases consistent with previous experimental studies of the same isotopes using different experimental methods (total interaction and reaction cross section measurements, momentum distribution measurements). For the future investigation of the structure of exotic nuclei, a universal detector system, EXL, is being developed. It will be installed at the NESR at the future FAIR facility, where higher-intensity beams of radioactive ions are expected. The use of storage ring techniques provides high-luminosity and low-background experimental conditions. Results from the feasibility studies of the EXL detector setup, performed at the present ESR storage ring, are presented.
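For reference, the rms matter radii quoted above follow from the fitted matter density in the standard way. The following definitions are a minimal sketch of the conventions commonly used in such core-plus-halo analyses, not expressions quoted from the thesis:

\[
\langle r_m^{2} \rangle^{1/2}
= \left( \frac{\int \rho_m(r)\, r^{2}\, \mathrm{d}^{3}r}{\int \rho_m(r)\, \mathrm{d}^{3}r} \right)^{1/2},
\qquad
\rho_m(r) = \rho_{\mathrm{core}}(r) + \rho_{\mathrm{halo}}(r),
\]

so that for a parameterization with $N_c$ core and $N_h$ halo nucleons the total mean-square radius combines as $\langle r_m^{2} \rangle = \bigl(N_c \langle r^{2} \rangle_{\mathrm{core}} + N_h \langle r^{2} \rangle_{\mathrm{halo}}\bigr)/(N_c + N_h)$.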
Abstract:
In this work, an aircraft-based laser ablation single-particle mass spectrometer was designed from scratch, built, characterized and deployed on several field measurement campaigns. The ALABAMA (Aircraft-based Laser ABlation Aerosol MAss Spectrometer) is capable of analyzing the chemical composition and size of individual aerosol particles in the submicrometer range (135 - 900 nm). After focusing in an aerodynamic lens, the aerodynamic diameter of each particle is first determined by a time-of-flight measurement between two continuous-wave lasers. The detected and classified particles are then individually evaporated and ionized by a targeted laser pulse. The ions are separated and detected in a bipolar time-of-flight mass spectrometer according to their mass-to-charge ratio. The resulting mass spectra provide detailed insight into the chemical structure of the individual particles. The entire instrument was designed to be operated on the new high-altitude research aircraft HALO and on other mobile platforms. To make this possible, all components were housed in a rack with a volume of less than 0.45 m³. The complete instrument including the rack weighs less than 150 kg and complies with the strict safety regulations for operation on board research aircraft, making ALABAMA the smallest and lightest instrument of its kind. After assembly, the properties and limits of all components were characterized in detail in the laboratory and during measurement campaigns. First, the properties of the particle beam, such as beam width and divergence, were investigated thoroughly. These results were important for validating the subsequent measurements of the detection and ablation efficiencies. The efficiency measurements showed that, depending on particle size and composition, up to 86% of the available aerosol particles are successfully detected and size-classified. Up to 99.5% of the detected particles could be ionized and thus chemically analyzed. These very high efficiencies are crucial in particular for measurements at high altitude, where particle concentrations can be very low. The bipolar mass spectrometer achieves average mass resolutions of up to R = 331. During laboratory and field measurements, elements such as Au, Rb, Co, Ni, Si, Ti and Pb could thereby be unambiguously identified on the basis of their isotope patterns. First measurements on board an ATR-42 research aircraft during the MEGAPOLI campaign in Paris yielded a comprehensive data set of aerosol particles within the planetary boundary layer. The ALABAMA could be operated reliably and precisely under harsh physical conditions (temperatures > 40°C, accelerations of +/- 2 g). Based on characteristic signals in the mass spectra, the particles could be reliably divided into 8 chemical classes, and individual classes could be attributed to specific sources. For example, particles with a strong sodium and potassium signature could be traced unambiguously to biomass burning. ALABAMA is thus a valuable instrument for characterizing particles in situ and for investigating a wide range of scientific questions, particularly in atmospheric research.
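The sizing step described above (particle velocity from the time of flight between the two detection lasers, then a calibration to aerodynamic diameter) can be illustrated with a minimal Python sketch. The beam separation, the calibration function and its constants are purely hypothetical placeholders, not values from the instrument:

```python
import numpy as np

LASER_SEPARATION_M = 0.05          # distance between the two cw detection lasers (assumption)

def aerodynamic_diameter_nm(t_flight_s, calib):
    """Map a single-particle time of flight to an aerodynamic diameter [nm]."""
    v = LASER_SEPARATION_M / t_flight_s          # particle velocity after the aerodynamic lens [m/s]
    # Hypothetical empirical calibration d_a = a * (v0 / v - 1)**b, as would be fitted with
    # monodisperse test particles; the real instrument uses its own calibration curve.
    a, b, v0 = calib
    return a * (v0 / v - 1.0) ** b

# Example with made-up calibration constants:
print(aerodynamic_diameter_nm(6.0e-4, calib=(1500.0, 1.5, 110.0)))
```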
Abstract:
The availability of a high-intensity antiproton beam with momentum up to 15~GeV/c at the future FAIR facility will open a unique opportunity to investigate wide areas of nuclear physics with the $\overline{P}$ANDA (anti$\overline{P}$roton ANnihilations at DArmstadt) detector. Part of these investigations concerns the Electromagnetic Form Factors of the proton in the time-like region and the study of the Transition Distribution Amplitudes, for which feasibility studies have been performed in this Thesis.

Moreover, simulations to study the efficiency and the energy resolution of the backward endcap of the electromagnetic calorimeter of $\overline{P}$ANDA are presented. This detector is crucial especially for the reconstruction of processes like $\bar p p \rightarrow e^+ e^- \pi^0$, investigated in this work. Different arrangements of dead material were studied. The results show that both the efficiency and the energy resolution of the backward endcap of the electromagnetic calorimeter fulfill the requirements for the detection of backward particles, and that this detector is necessary for the reconstruction of the channels of interest.

The study of the annihilation channel $\bar p p \rightarrow e^+ e^-$ will improve the knowledge of the Electromagnetic Form Factors in the time-like region and will help to understand their connection with the Electromagnetic Form Factors in the space-like region. In this Thesis the feasibility of a measurement of the $\bar p p \rightarrow e^+ e^-$ cross section with $\overline{P}$ANDA is studied using Monte Carlo simulations. The major background channel $\bar p p \rightarrow \pi^+ \pi^-$ is taken into account. The results show a $10^9$ background suppression factor, which assures a sufficiently clean signal with less than 0.1% background contamination. The signal can be measured with an efficiency greater than 30% up to $s = 14$~(GeV/c)$^2$. The Electromagnetic Form Factors are extracted from the reconstructed signal and the corrected angular distribution. Above this $s$ limit, the low cross section will not allow the direct extraction of the Electromagnetic Form Factors. However, the total cross section can still be measured, and an extraction of the Electromagnetic Form Factors is possible under certain assumptions on the ratio between the electric and magnetic contributions.

The Transition Distribution Amplitudes are new non-perturbative objects describing the transition between a baryon and a meson. They are accessible in hard exclusive processes like $\bar p p \rightarrow e^+ e^- \pi^0$. The study of this process with $\overline{P}$ANDA will test the Transition Distribution Amplitude approach. This work includes a feasibility study for measuring this channel with $\overline{P}$ANDA. The main background reaction here is $\bar p p \rightarrow \pi^+ \pi^- \pi^0$. A background suppression factor of $10^8$ has been achieved while keeping a signal efficiency above 20%.

Part of this work has been published in the European Physical Journal A 44, 373-384 (2010).
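For orientation, the separation of $|G_E|$ and $|G_M|$ from the reconstructed angular distribution rests on the Born-level (one-photon-exchange) differential cross section for $\bar p p \rightarrow e^+ e^-$. The form below is the commonly used expression, quoted here as a reference sketch rather than from the thesis, and conventions vary slightly between references:

\[
\frac{\mathrm{d}\sigma}{\mathrm{d}\cos\theta^{*}}
= \frac{\pi \alpha^{2}}{2\,\beta\, s}
\left[\, |G_M(s)|^{2}\,\bigl(1+\cos^{2}\theta^{*}\bigr)
+ \frac{1}{\tau}\, |G_E(s)|^{2}\,\sin^{2}\theta^{*} \,\right],
\qquad
\tau = \frac{s}{4 m_p^{2}},\quad
\beta = \sqrt{1-\frac{1}{\tau}} ,
\]

where $\theta^{*}$ is the electron emission angle in the centre-of-mass frame. An assumption on the ratio $|G_E|/|G_M|$ reduces the fit to a single normalization, which is what allows a form-factor estimate from the total cross section when statistics do not permit the angular separation.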
Abstract:
This dissertation is based on one theoretical article and two empirical studies.

The theoretical article: A theoretical framework model is proposed that examines the accumulation of work interruptions and its effects. Most previous studies have treated interruptions as an isolated phenomenon, disregarding the fact that several interruptions occur simultaneously (or in succession) during a typical working day. The present dissertation fills this gap by examining the process of accumulating interruptions. It describes the extent to which the accumulation of interruptions leads to a new quality of (negative) effects. The interplay and mutual reinforcement of individual effects are described, and moderating and mediating factors are identified. In this way it becomes possible to establish a link between the short-term effects of individual interruptions and health impairments caused by the working condition 'interruptions'.

Study 1: This study examined the extent to which interruptions influence a person's performance and well-being within a single working day. It was postulated that the occurrence of interruptions reduces satisfaction with one's own performance and increases the forgetting of intentions and the experience of irritation, with mental demands and time pressure acting as mediators. To test this, 133 nurses were surveyed via smartphones over 5 days. Multilevel analyses confirmed the main effects. The postulated mediation effects were confirmed for irritation and (partly) for satisfaction with performance, but not for forgetting of intentions. Interruptions therefore lead to negative effects (among other reasons) because they are cognitively demanding and consume time.

Study 2: This study measured the relationships between cognitive stressors (work interruptions and multitasking) and strain outcomes (mood and irritation) within a single working day. It was assumed that these relationships are moderated by chronological age and by indicators of functional age (working-memory capacity and attention). Older employees with poorer attention and working-memory performance were expected to be impaired most strongly by the stressors under investigation. A diary study (see Study 1) and computer-based cognitive performance tests were conducted. Multilevel analyses confirmed the main effects for the dependent variables mood (valence and wakefulness) and irritation, but not for arousal (mood). Three-way interactions were not found in the postulated direction: younger, not older, employees benefited from high basal cognitive capacity. Older employees appear to possess coping strategies that compensate for possible cognitive losses.

Overall, the (tested) assumptions of the theoretical framework model were confirmed. In principle it seems possible to transfer findings from laboratory research to the field, but the particularities of the field need to be taken into account. The postulated mediation effects (Study 1) were (partly) confirmed. The results indicate, however, that the full working day has to be examined and that very specific dependent variables also require more specific mediators. Furthermore, Study 2 confirmed that cognitive capacity is an important resource for dealing with interruptions, although other resources also come into play in the work context.
Abstract:
The aim of the study was to examine the economic performance as well as the perceived social and environmental impacts of organic cotton in Southern Kyrgyzstan on the basis of a comparative field study (44 certified organic farmers and 33 conventional farmers) carried out in 2009. It also investigated farmers’ motivation for and assessment of conversion to organic farming. Cotton yields on organic farms were found to be 10% lower, while input costs per unit were 42% lower, which resulted in organic farmers obtaining a 20% higher revenue from cotton. Due to the lower input costs and the organic and fair trade price premiums, the average gross margin from organic cotton was 27%. In addition to direct economic benefits, organic farmers enjoy a number of further advantages, such as easy access to credit on favourable terms, provision with uncontaminated cottonseed cooking oil and seed cake as animal feed, marketing support, and extension and training services provided by the newly established organic service provider. A large majority of organic farmers perceive an improvement in soil quality and in health conditions, and positively assess their earlier decision to convert to organic farming. The major disadvantage of organic farming is the high manual labour input required. In the study area, where manual farm work is mainly women’s work and male labour migration is widespread, women are the most affected by this negative aspect of organic farming. Altogether, the results suggest that, despite the inconvenience of a higher workload, the advantages of organic farming outweigh the disadvantages and that conversion to organic farming can improve the livelihoods of small-scale farmers.
Abstract:
Work-hour regulations for residency programmes in Switzerland, including a 50-hour weekly limit, came into force on 1 January 2005. Patient safety was one of the major arguments for their implementation. As the effect of the restriction of residency work hours on patient care in Switzerland has not yet been evaluated on the basis of objective data, the aim of the present study was to assess its impact by comparing patients' morbidity and mortality before (2001-2004) and after (2005-2008) the implementation.
Abstract:
RATIONALE AND OBJECTIVES: A feasibility study on measuring kidney perfusion with a contrast-free magnetic resonance (MR) imaging technique is presented. MATERIALS AND METHODS: A flow-sensitive alternating inversion recovery (FAIR) prepared true fast imaging with steady-state precession (TrueFISP) arterial spin labeling sequence was used on a 3.0-T MR scanner. The basis for quantification is a two-compartment exchange model proposed by Parkes, which corrects for various assumptions made in standard single-compartment models. RESULTS: Eleven healthy volunteers (mean age, 42.3 years; range, 24-55 years) were examined. The calculated mean renal blood flow values for the exchange model (109 +/- 5 [medulla] and 245 +/- 11 [cortex] ml/min/100 g) are in good agreement with the literature. Most importantly, the two-compartment exchange model exhibits a stabilizing effect on the evaluation of perfusion values when the finite permeability of the vessel wall and the venous outflow (fast solution) are taken into account: the corresponding values for the one-compartment standard model were 93 +/- 18 (medulla) and 208 +/- 37 (cortex) ml/min/100 g. CONCLUSION: This improvement will increase the accuracy of contrast-free imaging of kidney perfusion in the treatment of renovascular disease.
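For context, the standard single-compartment quantification that the two-compartment approach corrects is commonly written as below for a pulsed (FAIR-type) labeling scheme. This is a generic textbook form given for orientation only, with symbols as defined here; it is not the exact expression of the Parkes model used in the study:

\[
f \;=\; \frac{\lambda\, \Delta M(\mathrm{TI})}
{2\, \alpha\, M_{0}\, \mathrm{TI}\, \exp\!\left(-\mathrm{TI}/T_{1,\mathrm{blood}}\right)} ,
\]

where $\Delta M(\mathrm{TI})$ is the control-label signal difference at inversion time TI, $M_0$ the equilibrium magnetization, $\alpha$ the labeling efficiency, and $\lambda$ the blood-tissue water partition coefficient. The two-compartment exchange model additionally accounts for the finite permeability of the capillary wall and for venous outflow, which is what stabilizes the fitted perfusion values reported above.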
Abstract:
Volcanoes are the surficial expressions of complex pathways that vent magma and gases generated deep in the Earth. Geophysical data record at least a partial history of magma and gas movement in the conduit and of venting to the atmosphere. This work focuses on developing a more comprehensive understanding of explosive degassing at Fuego volcano, Guatemala, through observations and analysis of geophysical data collected in 2005 - 2009. A pattern of eruptive activity was observed during 2005 - 2007 and quantified with seismic and infrasound data, satellite thermal and gas measurements, and lava flow lengths. Eruptive styles are related to variable magma flux and accumulation of gas. Explosive degassing was recorded on broadband seismic and infrasound sensors in 2008 and 2009. Explosion energy partitioning between the ground and the atmosphere shows an increase in acoustic energy from 2008 to 2009, indicating a shift toward increased gas pressure in the conduit. Very-long-period (VLP) seismic signals are associated with the strongest explosions recorded in 2009, and waveform modeling in the 10 - 30 s band produces a best-fit source location 300 m west of and 300 m below the summit crater. The calculated moment tensor indicates a volumetric source, which is modeled as a dike feeding a SW-dipping (35°) sill. The sill is the dominant component, and its projection to the surface nearly intersects the summit crater. The deformation history of the sill is interpreted as: 1) an initial inflation due to pressurization, followed by 2) a rapid deflation as overpressure is explosively released, and finally 3) a reinflation as fresh magma flows into the sill and degasses. Tilt signals are derived from the horizontal components of the seismometer and show repetitive inflation-deflation cycles with a 20-minute period coincident with strong explosions. These cycles represent the pressurization of the shallow conduit and the explosive venting of overpressure that develops beneath a partially crystallized plug of magma. The energy released during the strong explosions has allowed imaging of Fuego's shallow conduit, which appears to have migrated west of the summit crater. In summary, Fuego is becoming more gas charged and its summit-centered vent is shifting to the west, with serious hazard consequences likely.
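The tilt signals mentioned above are commonly obtained from broadband horizontal components using the small-angle relation between ground tilt and the apparent horizontal acceleration it induces; the relation below is a reference sketch of that standard conversion, not an expression quoted from the dissertation:

\[
a_{h}(t) \;\approx\; -\,g\,\theta(t)
\qquad\Longrightarrow\qquad
\theta(t) \;\approx\; -\,\frac{a_{h}(t)}{g},
\]

where $a_h$ is the very-long-period apparent acceleration recovered from a horizontal seismometer component and $g$ is the gravitational acceleration. The relation holds at periods well beyond the sensor's corner period, which is why 20-minute inflation-deflation cycles can be read as tilt.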
Abstract:
Advances in information technology and global data availability have opened the door for assessments of sustainable development at a truly macro scale. It is now fairly easy to conduct a study of sustainability using the entire planet as the unit of analysis; this is precisely what this work set out to accomplish. The study began by examining some of the best-known composite indicator frameworks developed to measure sustainability at the country level today. Most of these were found to value human development factors and a clean local environment, but to gravely overlook consumption of (remote) resources in relation to nature's capacity to renew them, a basic requirement for a sustainable state. Thus, a new measuring standard is proposed, based on the Global Sustainability Quadrant approach. In a two-dimensional plot of nations' Human Development Index (HDI) vs. their Ecological Footprint (EF) per capita, the Sustainability Quadrant is defined by the area where both dimensions satisfy the minimum conditions of sustainable development: an HDI score above 0.8 (considered 'high' human development), and an EF below the fair Earth-share of 2.063 global hectares per person. After developing methods to identify those countries that are closest to the Quadrant at present and, most importantly, those that are moving towards it over time, the study tackled the question: what indicators of performance set these countries apart? To answer this, an analysis of raw data covering a wide array of environmental, social, economic, and governance performance metrics was undertaken. The analysis used country rank lists for each individual metric and compared them, using the Pearson Product Moment Correlation function, to the rank lists generated by the proximity/movement relative to the Quadrant measuring methods. The analysis yielded a list of metrics which are, with a high degree of statistical significance, associated with proximity to, and movement towards, the Quadrant; most notably:
Favorable for sustainable development: use of contraception, high life expectancy, high literacy rate, and urbanization.
Unfavorable for sustainable development: high GDP per capita, high language diversity, high energy consumption, and high meat consumption.
A momentary gain, but a burden in the long run: high carbon footprint and debt.
These results could serve as a solid stepping stone for the development of more reliable composite index frameworks for assessing countries' sustainability.
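The rank-list comparison described above amounts to correlating country rankings on one metric with rankings by proximity to (or movement towards) the Quadrant. A minimal Python sketch of that step is shown below; the variable names and the five-country example data are illustrative assumptions, not figures from the study:

```python
import numpy as np
from scipy import stats

# Hypothetical inputs: one value per country, in the same country order in both arrays.
metric_scores = np.array([0.62, 0.81, 0.44, 0.90, 0.55])       # e.g. literacy rate
quadrant_proximity = np.array([0.30, 0.12, 0.70, 0.05, 0.41])  # smaller = closer to the Quadrant

# Convert both variables to rank lists, then correlate the ranks with Pearson's r
# (applying Pearson's r to ranks is equivalent to Spearman's rank correlation).
metric_ranks = stats.rankdata(metric_scores)
proximity_ranks = stats.rankdata(quadrant_proximity)
r, p_value = stats.pearsonr(metric_ranks, proximity_ranks)
print(f"rank correlation r = {r:.2f}, p = {p_value:.3f}")
```

Repeating this for every metric and keeping only correlations with small p-values yields the kind of "favorable / unfavorable" lists summarized above.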
Abstract:
The capability to detect combustion in a diesel engine has the potential of being an important control feature for meeting increasingly stringent emission regulations, developing alternative combustion strategies, and using biofuels. In this dissertation, block-mounted accelerometers were investigated as potential feedback sensors for detecting combustion characteristics in a high-speed, high-pressure common rail (HPCR), 1.9L diesel engine. Accelerometers were positioned in multiple placements and orientations on the engine, and engine testing was conducted under motored, single-injection and pilot-main injection conditions. Engine tests were conducted at varying injection timings, engine loads, and engine speeds to observe the resulting time- and frequency-domain changes of the cylinder pressure and accelerometer signals. The frequency content of the cylinder-pressure-based signals and of the accelerometer signals between 0.5 kHz and 6 kHz indicated a strong correlation, with coherence values of nearly 1. The accelerometers were used to produce estimated combustion signals using Frequency Response Functions (FRF) measured from the frequency-domain characteristics of the cylinder pressure signals and the response of the accelerometers attached to the engine block. When compared to the actual combustion signals, the estimated combustion signals produced from the accelerometer response had Root Mean Square Errors (RMSE) between 7% and 25% of the actual signals' peak value. Weighting the FRFs from multiple test conditions along their frequency axis with the coherent output power reduced the median RMSE of the estimated combustion signals and the 95th percentile of the RMSE produced for each test condition. The RMSEs of the magnitude-based combustion metrics, including peak cylinder pressure, MPG (maximum pressure gradient), peak ROHR (rate of heat release), and work, estimated from the combustion signals produced by the accelerometer responses were between 15% and 50% of their actual values. The MPG measured from the estimated pressure gradient shared a direct relationship with the actual MPG. The location-based combustion metrics, such as the locations of peak values and the burn durations, achieved RMSE values as low as 0.9°. Overall, the accelerometer-based combustion sensing system was capable of detecting combustion and providing feedback on the in-cylinder combustion process.
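The core signal-processing idea, estimating an FRF between cylinder pressure and block vibration and then applying it (optionally weighted by coherence) to an accelerometer record to reconstruct a combustion signal, can be sketched as follows. This is a minimal illustration under assumed sampling rate, segment length and signal names, not the dissertation's implementation:

```python
import numpy as np
from scipy import signal

fs = 100_000        # assumed sampling rate [Hz]
nperseg = 4096      # Welch/CSD segment length (assumption)

def estimate_frf(accel, pressure, fs, nperseg):
    """H1-type FRF estimate from accelerometer (input) to cylinder pressure (output)."""
    f, Paa = signal.welch(accel, fs=fs, nperseg=nperseg)          # input auto-spectrum
    _, Pap = signal.csd(accel, pressure, fs=fs, nperseg=nperseg)  # cross-spectrum
    _, coh = signal.coherence(accel, pressure, fs=fs, nperseg=nperseg)
    H = Pap / Paa                                                 # H1 estimator
    return f, H, coh

def reconstruct_pressure(accel, f, H, coh, fs, weight_with_coherence=True):
    """Apply the (optionally coherence-weighted) FRF to a new accelerometer record."""
    A = np.fft.rfft(accel)
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
    W = np.interp(freqs, f, coh) if weight_with_coherence else 1.0
    H_i = np.interp(freqs, f, H.real) + 1j * np.interp(freqs, f, H.imag)
    return np.fft.irfft(W * H_i * A, n=len(accel))
```

In practice the FRF would be built from training records where both cylinder pressure and accelerometer data are available, and the reconstructed signal compared to measured pressure via an RMSE normalized by the peak value, mirroring the evaluation described above.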