902 results for intensive utilization


Relevance:

20.00%

Publisher:

Abstract:

[EN] We hypothesized that reliance on lactate as a means of energy distribution is higher after a prolonged period of acclimatization (9 wk) than at sea level, owing to a higher lactate Ra and disposal from active skeletal muscle. To evaluate this hypothesis, six Danish lowlanders (25 ± 2 yr) were studied at rest and during 20 min of bicycle exercise at 146 W at sea level (SL) and after 9 wk of acclimatization to 5,260 m (Alt). Whole body glucose Ra was similar at SL and Alt at rest and during exercise. Lactate Ra was also similar for the two conditions at rest; however, during exercise, lactate Ra was substantially lower at SL (65 µmol·min⁻¹·kg body wt⁻¹) than at Alt (150 µmol·min⁻¹·kg body wt⁻¹) at the same exercise intensity. During exercise, net lactate release was approximately 6-fold higher at Alt than at SL, and, related to this, tracer-calculated leg lactate uptake and release were both 3- to 4-fold higher at Alt than at SL. The contribution of the two legs to glucose disposal was similar at SL and Alt; however, the contribution of the two legs to lactate Ra was significantly lower at rest and during exercise at SL (27 and 81%) than at Alt (45 and 123%). In conclusion, at rest and during exercise at the same absolute workload, CHO and blood glucose utilization were similar at SL and Alt. Leg net lactate release was severalfold higher, and the contribution of leg lactate release to whole body lactate Ra was higher, at Alt than at SL. During exercise, the relative contribution of lactate oxidation to whole body CHO oxidation was substantially higher at Alt than at SL, as a result of increased uptake and subsequent oxidation of lactate by the active skeletal muscles.
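The "contribution of the legs to lactate Ra" figures above follow from a simple ratio of leg net release to whole-body rate of appearance. A minimal illustrative sketch of that arithmetic, not the authors' actual computation: the whole-body Ra values are taken from the abstract, while the leg-release values are hypothetical round numbers chosen only to reproduce the reported percentages.

```python
# Illustrative sketch of how leg contribution to whole-body lactate Ra is derived.
# Whole-body Ra values come from the abstract; leg net-release values are
# hypothetical, chosen only to match the reported ~81% and ~123% contributions.

def leg_contribution(leg_net_release, whole_body_ra):
    """Fraction (%) of whole-body lactate Ra accounted for by the two legs."""
    return 100.0 * leg_net_release / whole_body_ra

# Whole-body lactate Ra during exercise (µmol·min⁻¹·kg body wt⁻¹), from the abstract
ra_sea_level = 65.0
ra_altitude = 150.0

# Hypothetical leg net-release values (same units)
release_sl = 52.65    # yields 81% at sea level
release_alt = 184.5   # yields 123% at altitude

print(leg_contribution(release_sl, ra_sea_level))   # 81.0
print(leg_contribution(release_alt, ra_altitude))   # 123.0
```

A contribution above 100%, as at altitude, is possible because muscle simultaneously takes up and releases lactate, so leg release can exceed net whole-body appearance.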


Official Master's degree in Marine Aquaculture (Máster Oficial en Cultivos Marinos). Work presented in partial fulfillment of the requirements for the Official Master's degree in Marine Aquaculture, awarded by the Universidad de Las Palmas de Gran Canaria (ULPGC), the Instituto Canario de Ciencias Marinas (ICCM), and the Centro Internacional de Altos Estudios Agronómicos Mediterráneos de Zaragoza (CIHEAM).


The relation between intercepted light and orchard productivity has been considered linear, although this dependence seems to be governed more by the planting system than by light intensity. At the whole-plant level, an increase in irradiance does not always produce an increase in productivity. One reason is the plant's intrinsic inefficiency in using energy: in full light, generally only 5–10% of the total incoming energy is allocated to net photosynthesis. Preserving or improving this efficiency is therefore pivotal for scientists and fruit growers. Even though a conspicuous amount of energy is reflected or transmitted, plants cannot avoid absorbing photons in excess. Over-excitation of chlorophyll promotes the production of reactive species, increasing the risk of photoinhibition. The dangerous consequences of photoinhibition forced plants to evolve complex, multilevel machinery able to dissipate excess energy by quenching it as heat (non-photochemical quenching), by moving electrons (the water-water cycle, cyclic transport around PSI, the glutathione-ascorbate cycle, and photorespiration), and by scavenging the reactive species generated. The price plants pay for this equipment is the use of CO2 and reducing power, with a consequent decrease in photosynthetic efficiency, both because some photons are not used for carboxylation and because an effective loss of CO2 and reducing power occurs. Net photosynthesis increases with light up to the saturation point; additional PPFD does not improve carboxylation but increases the share of energy dissipated through the alternative pathways, along with ROS production and the risk of photoinhibition. Even this extensive photo-protective apparatus, however, cannot always cope with the excess incoming energy, and photodamage then occurs. Any event that increases photon pressure and/or decreases the efficiency of the photo-protective mechanisms described (e.g. thermal stress, water or nutritional deficiency) can exacerbate photoinhibition.
In nature, only a small fraction of photosystems is likely found damaged, thanks to an effective, efficient, and energy-consuming recovery system. Since damaged PSII is quickly repaired at an energetic expense, it would be interesting to investigate how much PSII recovery costs in terms of plant productivity. This PhD dissertation aims to improve knowledge of the several strategies employed to manage incoming energy, and of the implications of excess light for photodamage, in peach. The thesis is organized in three scientific units. In the first section, a new rapid, non-intrusive, whole-tissue, and universal technique for determining functional PSII was implemented and validated on different kinds of plants: C3 and C4 species, woody and herbaceous plants, wild type and a chlorophyll b-less mutant, and monocots and dicots. In the second unit, using a singular experimental orchard named the "Asymmetric orchard", the relation between light environment and photosynthetic performance, water use, and photoinhibition was investigated in peach at the whole-plant level; furthermore, the effect of variation in photon pressure on energy management was considered at the single-leaf level. In the third section, the quenching analysis method suggested by Kornyeyev and Hendrickson (2007) was validated on peach. It was then applied in the field, where the influence of moderate reductions in light and water on peach photosynthetic performance, water requirements, energy management, and photoinhibition was studied. Using solar energy as the fuel for life is intrinsically hazardous for a plant, given the constant, high risk of photodamage. This dissertation seeks to highlight the complex relation between plants, peach in particular, and light, analysing the principal strategies plants have developed to manage incoming light so as to derive the maximum possible benefit while minimizing the risks.
In the first instance, the new method proposed for functional PSII determination, based on P700 redox kinetics, appears to be a valid, non-intrusive, universal, and field-applicable technique, not least because it probes the whole leaf tissue in depth rather than only the first leaf layers, as fluorescence does. The fluorescence parameter Fv/Fm gives a good estimate of functional PSII, but only when data obtained from the adaxial and abaxial leaf surfaces are averaged. In addition to this method, the energy quenching analysis proposed by Kornyeyev and Hendrickson (2007), combined with the photosynthesis model proposed by von Caemmerer (2000), is a powerful tool for analysing and studying, even in the field, the relation between the plant and environmental factors such as water, temperature, and, above all, light. The "Asymmetric" training system is a good way to study the relations among light energy, photosynthetic performance, and water use in the field. At the whole-plant level, net carboxylation increases with PPFD up to a saturation point. Excess light, rather than improving photosynthesis, may exacerbate water and thermal stress, leading to stomatal limitation. Furthermore, too much light promotes not an improvement in net carboxylation but PSII damage: in the most light-exposed plants, about 50–60% of total PSII is inactivated. At the single-leaf level, net carboxylation increases up to a saturation point (1000–1200 μmol m⁻² s⁻¹), and excess light is dissipated by non-photochemical quenching and by non-net-carboxylative electron transports. The latter follow a pattern quite similar to the Pn/PPFD curve, reaching saturation at almost the same photon flux density. At middle-low irradiance, NPQ seems to be limited by lumen pH, because the incoming photon pressure is not sufficient to generate the lumen pH required for full activation of violaxanthin de-epoxidase (VDE). Peach leaves try to cope with excess light by increasing the non-net-carboxylative transports.
As PPFD rises, the xanthophyll cycle is increasingly activated and the rate of non-net-carboxylative transports is reduced. Some of these alternative transports, such as the water-water cycle, cyclic transport around PSI, and the glutathione-ascorbate cycle, are able to generate additional H+ in the lumen and thereby support VDE activation when light is limiting. Moreover, the alternative transports seem to act as an important dissipative route when high temperature and sub-optimal conductance heighten the risk of photoinhibition. In peach, a moderate reduction in water and light does not cause a decrease in net carboxylation; rather, by diminishing the incoming light and the environmental evapo-transpiration demand, it lowers stomatal conductance and improves water use efficiency. Therefore, by lowering light intensity to levels that are still non-limiting, water could be saved without compromising net photosynthesis. The quenching analysis can partition the absorbed energy among the several utilization, photoprotection, and photo-oxidation pathways. When recovery is permitted, only a few PSII remain unrepaired, although more net PSII damage is recorded in plants placed in full light. In this experiment too, under over-saturating light the main dissipation pathway is non-photochemical quenching; at middle-low irradiance it seems to be pH-limited, and other transports, such as photorespiration and the alternative transports, are used to support photoprotection and to help create the optimal trans-thylakoidal ΔpH for violaxanthin de-epoxidase. These alternative pathways become the main quenching mechanisms in very low-light environments. Another aspect pointed out by this study is the role of NPQ as a dissipative pathway when conductance becomes severely limiting. The observation that in nature only a small amount of damaged PSII is seen indicates the presence of an effective and efficient recovery mechanism that masks the real photodamage occurring during the day.
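The quenching analysis used here partitions the light absorbed by a leaf among photochemistry, regulated thermal dissipation (NPQ), and constitutive losses. A minimal sketch of a fluorescence-based partitioning of this family (the simple Hendrickson-style form; the thesis's exact equations, which also account for photoinactivation, are not reproduced here, and the variable names are ours):

```python
# Sketch of a fluorescence-based partitioning of absorbed light into three
# fractions: photochemistry, regulated non-photochemical quenching, and
# constitutive fluorescence/heat losses. This is the simple Hendrickson-style
# form, used only to illustrate the idea; readings below are hypothetical.

def partition_absorbed_light(F, Fm_prime, Fm):
    """F: steady-state fluorescence in the light; Fm_prime: light-adapted
    maximum fluorescence; Fm: dark-adapted maximum fluorescence."""
    phi_psii = 1.0 - F / Fm_prime       # fraction used for photochemistry
    phi_npq  = F / Fm_prime - F / Fm    # regulated thermal dissipation (NPQ)
    phi_fd   = F / Fm                   # constitutive losses
    return phi_psii, phi_npq, phi_fd

# Hypothetical readings for a sunlit peach leaf
p, n, f = partition_absorbed_light(F=500.0, Fm_prime=1000.0, Fm=2500.0)
print(p, n, f)      # 0.5 0.3 0.2
print(p + n + f)    # the three fractions always sum to 1.0
```

By construction the three fractions sum to unity, which is what lets the method account for where every absorbed photon's energy went, even in the field.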
At the single-leaf level, when repair is not allowed, leaves in full light are twofold more photoinhibited than shaded ones. Light in excess of the photosynthetic optimum therefore does not promote net carboxylation but increases water loss and PSII damage. The greater the photoinhibition, the more photosystems must be repaired, and consequently the more energy and dry matter must be allocated to this essential activity. Since above the saturation point net photosynthesis is constant while photoinhibition increases, it would be interesting to investigate how much photodamage costs in terms of tree productivity. Another aspect of pivotal importance to be explored further is the combined influence of light and other environmental parameters, such as water status, temperature, and nutrition, on the management of light, water, and photosynthate in peach.


Providing support for multimedia applications on low-power mobile devices remains a significant research challenge. This is primarily due to two reasons:
• Portable mobile devices have modest sizes and weights, and therefore inadequate resources: low CPU processing power, reduced display capabilities, and limited memory and battery lifetimes as compared to desktop and laptop systems.
• Multimedia applications, on the other hand, tend to have distinctive QoS and processing requirements which make them extremely resource-demanding.
This innate conflict introduces key research challenges in the design of multimedia applications and device-level power optimization. Energy efficiency on this kind of platform can be achieved only via a synergistic hardware and software approach. In fact, while Systems-on-Chip are increasingly programmable, thus providing functional flexibility, hardware-only power reduction techniques cannot keep consumption within acceptable bounds. It is well understood, both in research and in industry, that system configuration and management cannot be controlled efficiently by relying only on low-level firmware and hardware drivers: at this level there is a lack of information about user application activity, and consequently about the impact of power management decisions on QoS. Even though operating system support and integration are a requirement for effective performance and energy management, more effective and QoS-sensitive power management is possible if power awareness and hardware configuration control strategies are tightly integrated with domain-specific middleware services. The main objective of this PhD research has been the exploration and integration of middleware-centric energy management with applications and the operating system. We chose to focus on the CPU-memory and video subsystems, since they are the most power-hungry components of an embedded system.
A second main objective has been the definition and implementation of software facilities (such as toolkits, APIs, and run-time engines) to improve the programmability and performance efficiency of such platforms. Enhancing energy efficiency and programmability of modern Multi-Processor Systems-on-Chip (MPSoCs). Consumer applications are characterized by tight time-to-market constraints and extreme cost sensitivity. The software that runs on modern embedded systems must be high performance, real time, and, even more important, low power. Although much progress has been made on these problems, much remains to be done. Multi-Processor Systems-on-Chip (MPSoCs) are increasingly popular platforms for high-performance embedded applications. This leads to interesting challenges in software development, since efficient software development is a major issue for MPSoC designers. An important step in deploying applications on multiprocessors is to allocate and schedule concurrent tasks to the processing and communication resources of the platform. The problem of allocating and scheduling precedence-constrained tasks on processors in a distributed real-time system is NP-hard. There is a clear need for deployment technology that addresses these multiprocessing issues. The problem can be tackled by means of specific middleware which takes care of allocating and scheduling tasks on the different processing elements and which also tries to optimize the power consumption of the entire multiprocessor platform. This dissertation is an attempt to develop insight into efficient, flexible, and optimal methods for allocating and scheduling concurrent applications on multiprocessor architectures. It is a well-known problem in the literature: optimization problems of this kind are very complex even in much-simplified variants, so most authors propose simplified models and heuristic approaches to solve them in reasonable time.
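To make the allocation-and-scheduling problem concrete, here is a toy greedy list-scheduling heuristic for precedence-constrained tasks on identical processors. This is the kind of simplified heuristic the abstract contrasts with the dissertation's exact optimal methods, not the dissertation's own algorithm; the pipeline tasks and durations are made up for illustration.

```python
# Toy list-scheduling heuristic for precedence-constrained tasks on identical
# processors. Illustrative only: this is the heuristic style the abstract says
# most authors use, NOT the exact methods developed in the dissertation.

def list_schedule(tasks, deps, durations, n_procs):
    """Greedy list scheduling: repeatedly assign a ready task to the
    earliest-free processor. Returns {task: (proc, start, end)} and the makespan."""
    finish = {}                      # task -> completion time
    proc_free = [0.0] * n_procs      # next free time per processor
    schedule = {}
    remaining = list(tasks)
    while remaining:
        # a task is "ready" once all its predecessors have finished
        ready = [t for t in remaining if all(d in finish for d in deps.get(t, []))]
        t = ready[0]
        p = min(range(n_procs), key=lambda i: proc_free[i])
        start = max(proc_free[p],
                    max((finish[d] for d in deps.get(t, [])), default=0.0))
        finish[t] = start + durations[t]
        proc_free[p] = finish[t]
        schedule[t] = (p, start, finish[t])
        remaining.remove(t)
    return schedule, max(finish.values())

# A small hypothetical streaming pipeline: src -> filter -> encode -> sink
tasks = ["src", "filter", "encode", "sink"]
deps = {"filter": ["src"], "encode": ["filter"], "sink": ["encode"]}
durations = {"src": 1.0, "filter": 2.0, "encode": 3.0, "sink": 1.0}
sched, makespan = list_schedule(tasks, deps, durations, n_procs=2)
print(makespan)   # 7.0 -- a pure chain exposes no parallelism
```

Heuristics like this run in polynomial time but give no bound on how far the result is from optimal, which is exactly the "optimality gap" discussed next.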
Model simplification is often achieved by abstracting away platform implementation "details". As a result, optimization problems become more tractable, even reaching polynomial time complexity. Unfortunately, this approach creates an abstraction gap between the optimization model and the real HW-SW platform. The main issue with heuristic or, more generally, incomplete search is that it introduces an optimality gap of unknown size: it provides very limited or no information on the distance between the best computed solution and the optimal one. The goal of this work is to address both the abstraction and the optimality gap, by formulating accurate models that account for a number of "non-idealities" in real-life hardware platforms, by developing novel mapping algorithms that deterministically find optimal solutions, and by implementing the software infrastructures developers need to deploy applications on the target MPSoC platforms. Energy-efficient LCD backlight autoregulation on a real-life multimedia application processor. Despite the ever-increasing advances in Liquid Crystal Display (LCD) technology, LCD power consumption is still one of the major limitations to the battery life of mobile appliances such as smartphones, portable media players, and gaming and navigation devices. There is a clear trend towards larger LCDs to exploit the multimedia capabilities of portable devices that can receive and render high-definition video and pictures. Multimedia applications running on these devices require LCD screen sizes of 2.2 to 3.5 inches and more to display video sequences and pictures with the required quality. LCD power consumption depends on the backlight and on the pixel-matrix driving circuits, and is typically proportional to the panel area. As a result, its contribution is also likely to be considerable in future mobile appliances.
To address this issue, companies are proposing low-power technologies suitable for mobile applications, supporting low-power states and image control techniques. On the research side, several power-saving schemes and algorithms can be found in the literature. Some of them exploit software-only techniques that change the image content to reduce the power associated with crystal polarization; others aim to decrease the backlight level while offsetting the luminance reduction, compensating the user-perceived quality degradation with pixel-by-pixel image processing algorithms. The major limitation of these techniques is that they rely on the CPU to perform pixel-based manipulations, and their impact on CPU utilization and power consumption has not been assessed. This PhD dissertation shows an alternative approach that exploits, in a smart and efficient way, the hardware image processing unit integrated in almost every current multimedia application processor, to implement hardware-assisted image compensation that allows dynamic scaling of the backlight with a negligible impact on QoS. The proposed approach overcomes CPU-intensive techniques by saving system power without requiring either a dedicated display technology or hardware modification. Thesis overview. The remainder of the thesis is organized as follows. The first part focuses on enhancing energy efficiency and programmability of modern Multi-Processor Systems-on-Chip (MPSoCs). Chapter 2 gives an overview of architectural trends in embedded systems, illustrating the principal features of new technologies and the key challenges still open. Chapter 3 presents a QoS-driven methodology for optimal allocation and frequency selection for MPSoCs; the methodology is based on functional simulation and full-system power estimation. Chapter 4 targets allocation and scheduling of pipelined stream-oriented applications on top of distributed-memory architectures with messaging support.
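The backlight-compensation idea above rests on a simple model: perceived luminance is roughly proportional to backlight level times pixel value, so dimming the backlight by a factor b can be offset by scaling pixel values by 1/b, until pixels saturate at full scale (which is where visible quality loss appears). A minimal sketch of that trade-off with made-up values:

```python
# Minimal sketch of backlight dimming with luminance compensation: perceived
# luminance ~ backlight * pixel_value, so dimming by a factor b is offset by
# scaling pixels by 1/b, clipped at full scale. Frame values are illustrative;
# the thesis implements this in the hardware image-processing unit, not in Python.

def compensate(pixels, dim_factor, max_val=255):
    """Scale pixel values by 1/dim_factor, clipping at max_val.
    Returns the compensated frame and the fraction of clipped pixels."""
    out = [min(round(p / dim_factor), max_val) for p in pixels]
    clipped = sum(1 for p in pixels if p / dim_factor > max_val) / len(pixels)
    return out, clipped

frame = [10, 100, 200, 250]            # one row of an 8-bit grayscale frame
dimmed, clip_ratio = compensate(frame, dim_factor=0.8)
print(dimmed)       # [12, 125, 250, 255] -- brightest pixel is clipped
print(clip_ratio)   # 0.25: one of four pixels exceeded full scale
```

The clipped fraction is a crude proxy for the QoS degradation that pixel-by-pixel schemes must keep negligible when choosing how far to dim.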
We tackled the complexity of the problem by means of decomposition and no-good generation, and we prove the increased computational efficiency of this approach with respect to traditional ones. Chapter 5 presents a cooperative framework that solves the allocation, scheduling, and voltage/frequency selection problem to optimality for energy-efficient MPSoCs, while Chapter 6 takes applications with conditional task graphs into account. Finally, Chapter 7 proposes a complete framework, called Cellflow, to help programmers implement software efficiently on a real architecture, the Cell Broadband Engine processor. The second part focuses on energy-efficient software techniques for LCD displays. Chapter 8 gives an overview of portable-device display technologies, illustrating the principal features of LCD video systems and the key challenges still open. Chapter 9 surveys several energy-efficient software techniques present in the literature, while Chapter 10 illustrates in detail our method for saving significant power in an LCD panel. Finally, conclusions are drawn, summarizing the main research contributions discussed throughout this dissertation.


Bioinformatics is a recent and emerging discipline which aims to study biological problems through computational approaches. Most branches of bioinformatics, such as genomics, proteomics, and molecular dynamics, are particularly computationally intensive, requiring huge amounts of computational resources to run algorithms of ever-increasing complexity over data of ever-increasing size. In the search for computational power, the EGEE Grid platform, the world's largest community of interconnected clusters load-balanced as a whole, seems particularly promising and is considered the new hope for satisfying the ever-increasing computational requirements of bioinformatics, as well as physics and other computational sciences. The EGEE platform, however, is rather new and not yet free of problems. In addition, specific requirements of bioinformatics need to be addressed in order to use this new platform effectively for bioinformatics tasks. In my three years of Ph.D. work I addressed numerous aspects of this Grid platform, with particular attention to those needed by the bioinformatics domain. I created three major frameworks, Vnas, GridDBManager, and SETest, plus an additional smaller standalone solution, to enhance the support for bioinformatics applications in the Grid environment and to reduce the effort needed to create new applications, additionally addressing numerous existing Grid issues and performing a series of optimizations. The Vnas framework is an advanced system for the submission and monitoring of Grid jobs that provides a reliable abstraction over the Grid platform. In addition, Vnas greatly simplifies the development of new Grid applications by providing a callback system that eases the creation of arbitrarily complex multistage computational pipelines, and it provides an abstracted virtual sandbox which bypasses Grid limitations.
Vnas also reduces the usage of Grid bandwidth and storage resources by transparently detecting the equality of virtual sandbox files based on their content, across different submissions, even when performed by different users. BGBlast, an evolution of the earlier GridBlast project, now provides a Grid Database Manager (GridDBManager) component for managing and automatically updating biological flat-file databases in the Grid environment. GridDBManager offers novel features such as an adaptive replication algorithm that constantly optimizes the number of replicas of the managed databases in the Grid environment, balancing response times (performance) against storage costs according to a programmed cost formula. GridDBManager also provides highly optimized automated management of older versions of the databases based on reverse delta files, which reduces the storage cost of keeping such older versions available in the Grid environment by two orders of magnitude. The SETest framework gives the user a way to test and regression-test Python applications riddled with side effects (a common case with Grid computational pipelines), which could not easily be tested using the more standard methods of unit testing or test cases. The technique is based on a new concept of datasets containing invocations and results of filtered calls. The framework thus significantly accelerates the development of new applications and computational pipelines for the Grid environment and reduces the effort required for maintenance. An analysis of the impact of these solutions is provided in this thesis. This Ph.D. work originated various publications in journals and conference proceedings, as reported in the Appendix. I also orally presented my work at numerous international conferences related to Grid computing and bioinformatics.
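Content-based equality detection of sandbox files, as Vnas does, is typically achieved by keying storage on a cryptographic digest of the file contents rather than on file names or owners. The abstract does not specify Vnas's mechanism, so the hash-keyed store below is an assumption, a minimal sketch of the general idea:

```python
# Sketch of content-based duplicate detection across sandbox submissions, in
# the spirit of Vnas. Hashing is an assumption -- the abstract does not say how
# Vnas detects equality. Identical content yields the same digest regardless of
# file name or submitting user, so the bytes need be stored/transferred once.

import hashlib

class SandboxStore:
    def __init__(self):
        self._store = {}                 # digest -> content

    def add(self, content: bytes) -> str:
        """Store content once and return its digest; re-adding identical
        content (even under another name or user) adds nothing new."""
        digest = hashlib.sha256(content).hexdigest()
        self._store.setdefault(digest, content)
        return digest

    def unique_files(self) -> int:
        return len(self._store)

store = SandboxStore()
d1 = store.add(b"BLAST input sequences")   # user A's submission
d2 = store.add(b"BLAST input sequences")   # user B, same content, different name
print(d1 == d2)             # True -- detected as the same file
print(store.unique_files()) # 1 -- stored and shipped only once
```

Keying on content rather than metadata is what makes the saving work across users: the store cannot be fooled by renames, and bandwidth is spent only on genuinely new bytes.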


Problem definition: Although progress in medical biotechnology has enabled fetuses to survive outside the womb at ever-lower gestational ages, the short- and long-term prognosis of each newborn often remains uncertain, and medicine is not always able to fully and definitively remedy the health damage it often contributes to causing. Promptly subjecting extremely premature newborns to intensive care does not guarantee their survival; likewise, withholding it does not guarantee their death, or at least their immediate death. In both cases the health damage (vision and hearing defects, blindness, deafness, limb paralysis, motor deficits, intellectual disability, learning and behavioral disorders) can be severe and permanent, but cannot be predicted with certainty for any individual newborn. The unknown future of each newborn, together with the crumbling of shared moral ground, forces us to confront wrenching moral dilemmas about initiating and refusing, continuing and withdrawing care. Subject: This work undertakes a critical analysis of some theoretical and practical strategies by which, in the field of intensive care for premature newborns, the scientific community, bioethicists, and the law attempt to circumvent or resolve scientific and moral uncertainty. These strategies share the search for objective, or at least intersubjective, moral criteria that would allow surrogate decision-makers to reach agreement and safeguard the true good of the patient. The criteria examined range from scientific data on the prognosis of premature infants to "facts" of a more strictly moral nature, such as the patient's suffering, respect for the physician's freedom of conscience, and the newborn's interest in surviving and living well.
Objective: The purpose of this analysis is to verify whether these strategies actually succeed in resolving moral uncertainty, or whether they instead leave the moral dilemma of neonatal intensive care open. In the latter case, an answer will be sought to the question "who should decide for the newborn?" Methodology and tools: The analysis examines the most important international scientific documents containing medical care recommendations and the opinions of the scientific community; scientific studies on the state of the art of current knowledge and available therapeutic tools; the views of leading bioethicists and the decision-making approaches most frequently proposed and adopted; international legal documents on the regulation of clinical experimentation; significant court rulings concerning cases of medical intervention, or abstention from intervention, without or against parental consent; surveys of physicians' opinions and of international medical practice; and the most relevant ethical theories concerning the criteria for choosing the newborn's legitimate surrogate decision-maker and the definition of the newborn's "best interests" from a moral-philosophical point of view. Structure: The first chapter reconstructs the most important stages in the history of neonatal intensive care, with particular attention to developments in respiratory support over recent decades. This highlights both the moral and social changes produced by the mechanization and medicalization of neonatal care, and the continuity of neonatal medicine with traditional medical paternalism, with its theoretical and practical limits and with the drift of therapeutic practice into uncontrolled experimentation. The second chapter critically examines the first three strategies for resolving scientific and moral uncertainty.
The first consists in deciding the "fate" of an individual patient on the basis of statistical data on the prognosis of all infants born in similar clinical conditions (the "statistical approach"); the second, on the basis of the individual patient's response to therapy (the "individualized prognostic approach"); the third, on the basis of the evolution of the individual clinical condition observed during a period of "aggressive" treatment long enough to allow the collection of clinical data useful for formulating a reliable prognosis (the "treat until certainty" approach). The first strategy receives the most extensive treatment, because the use of scientific studies to predict the prognosis of each newborn is common to all three approaches and constitutes the most widespread and emblematic strategy for circumventing uncertainty. It consists in the construction of an "evidence-based ethics", by analogy with "evidence-based medicine", in that it aspires to ground the moral duties of physicians and parents solely on verifiable facts (scientific evidence of treatment efficacy) that appear indisputable, value-free, or self-certifying of the morality of the choices. Since the rhetorical force of this strategy rests precisely on a (partial) denial of the uncertainty of scientific data and on the presumed irrelevance of the plurality and complexity of moral values in medical decisions, most of the second chapter is devoted to discussing the limits of the scientific validity of the prognostic studies physicians rely on to predict the prognosis of each newborn. To the same end, this chapter highlights the false moral neutrality of scientific judgments of "efficacy", "good prognosis", "poor prognosis", and of the "tolerability" of risks or "costs".
The third chapter addresses the experimental nature of intensive care for premature newborns, in order to suggest a further source of moral uncertainty, of the untenability of medical obligations to treat, and of the devaluation of the institution of the parents' free and informed consent. It then documents the existence of two opposite attitudes manifested by the scientific community, bioethicists, and the law: on the one hand, silence on the experimental nature of the therapies; on the other, the moral self-certification of uncontrolled experimentation. It then seeks to show how both, although opposite, tend to conceal the uncertainty and complexity of care for premature newborns, thereby reaffirming the precedence of the physician's decision-making authority over that of the parents. The fourth chapter attempts to answer the question "who should decide under conditions of uncertainty?" It therefore outlines another strategy for resolving uncertainty: the definition of the newborn's best interest as the object, limit, and ultimate goal of medical decisions, whatever they may be. The hypothesis is tested that the legitimate final decision-maker is the one who knows the newborn's best interests better than anyone else. To that end, this chapter presents and criticizes some philosophical theories of the concept of "best interest" as applied to the special status of the newborn. Since this analysis, by revealing the unknowability of the newborn's best interest, does not allow us to establish with certainty who is entitled to make the final decision, the last chapter examines other reasons why parents should have the last word in decisions to end or to continue life.
After discarding these, an alternative reason is proposed which, although not conclusive, has the merit of acknowledging, rather than trivializing, the irreducibility of uncertainty and the extreme complexity of moral choices, without, however, risking decision paralysis, moral nihilism, or the non-peaceful resolution of conflicts.


Our research asked the following main questions: how do the characteristics of professional service firms allow them to innovate successfully, exploiting through exploring, by combining internal and external factors of innovation, and how do these ambidextrous organisations perceive these factors; and how do successful innovators in professional service firms use corporate entrepreneurship models in their new service development processes? With the goal of shedding light on innovation in professional knowledge-intensive business service firms (PKIBS), we conducted a qualitative analysis of ten globally acting law firms providing business legal services. We analyse the internal and external factors of innovation that are critical for PKIBS' innovation, and we suggest how these firms become ambidextrous in a changing environment. Our findings show that firms of this kind have a particular type of ambidexterity, owing to their specific characteristics. As PKIBS depend heavily on their human capital and governance structure, and face high expectations from their clients, their ambidexterity is structural but also contextual at the same time. In addition, we suggest three types of corporate entrepreneurship model that international PKIBS use to enhance innovation in turbulent environments. We looked at how law firms going through turbulent environments were using corporate entrepreneurship activities as part of their strategies to be more innovative. Using a visual mapping methodology, we identified three types of innovation pattern in the law firms. We suggest that corporate entrepreneurship models depend on the successful application of mainly three elements: who participates in corporate entrepreneurship initiatives; what formal processes enhance these initiatives; and what policies are applied to this type of behaviour.

Relevância:

20.00%

Publicador:

Resumo:

In this thesis the potential risks associated with the application of biochar to soil, as well as the stability of biochar, were investigated. The study focused on the potential risks arising from the occurrence of polycyclic aromatic hydrocarbons (PAHs) in biochar. An analytical method was developed for the determination of the 16 USEPA PAHs in the original biochar and in soil containing biochar. The method was successfully validated with a certified reference material for the soil matrix and compared with methods in use in other laboratories during a laboratory exercise within the EU-COST action TD1107. The concentrations of the 16 USEPA PAHs, along with the 15 EU PAHs (priority hazardous substances in food), were determined in a suite of currently available biochars intended for agricultural field application and derived from a variety of parent materials and pyrolysis conditions. The biochars analysed contained the USEPA PAHs, and some of the EU PAHs, at detectable levels ranging from 1.2 to 19 µg g-1. This method made it possible to follow changes in PAH content and distribution over a four-year study after biochar addition to a vineyard soil (CNR-IBIMET). The results showed that biochar addition led to an increase in the amount of PAHs; however, PAH levels in the soil remained within the maximum acceptable concentrations for European countries. The CNR-IBIMET vineyard trial was also used to study the environmental stability of biochar and its impact on soil organic carbon. The stability of biochar was investigated by analytical pyrolysis (Py-GC-MS) and by pyrolysis in the presence of hydrogen (HyPy). The findings showed that biochar amendment significantly influenced the concentration of the stable soil carbon fraction during the incubation period. Moreover, HyPy and Py-GC-MS were applied to biochars derived from three different feedstocks at two different pyrolysis temperatures; the results showed the influence of feedstock type and pyrolysis conditions on the degree of carbonisation.
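The screening step described above, comparing the total content of the 16 USEPA PAHs in each biochar against a maximum acceptable concentration, can be sketched as follows. Note that the limit value and the sample totals are illustrative assumptions, not figures taken from the thesis:

```python
# Hypothetical screening of total USEPA-PAH content (µg g-1) in biochar
# samples against an assumed regulatory limit. The limit and the sample
# values below are illustrative only.

PAH_LIMIT_UG_PER_G = 20.0  # assumed maximum acceptable total PAH content

def screen_biochars(samples):
    """Return the names of samples whose total 16-PAH content exceeds the limit."""
    return [name for name, total in samples.items()
            if total > PAH_LIMIT_UG_PER_G]

samples = {
    "biochar_A": 1.2,   # low end of the range reported in the study
    "biochar_B": 19.0,  # high end of the range reported in the study
    "biochar_C": 25.4,  # hypothetical out-of-range sample
}

print(screen_biochars(samples))  # -> ['biochar_C']
```

In practice the applicable limit differs between European countries and end uses, so the threshold would be a parameter rather than a constant.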

Relevância:

20.00%

Publicador:

Resumo:

To investigate whether next of kin can serve as proxies for assessing patients' satisfaction with care in the intensive care unit (ICU).

Relevância:

20.00%

Publicador:

Resumo:

Practice guidelines are systematically developed statements and recommendations that assist physicians and patients in making decisions about appropriate health care for specific clinical circumstances, taking into account specific national health care structures. The 1st revision of the S2k guideline of the German Sepsis Society, developed in collaboration with 17 German medical scientific societies and one self-help group, provides state-of-the-art information (results of controlled clinical trials and expert knowledge) on the effective and appropriate medical care (prevention, diagnosis, therapy and follow-up care) of critically ill patients with severe sepsis or septic shock. The guideline was developed according to the "German Instrument for Methodological Guideline Appraisal" of the Association of the Scientific Medical Societies (AWMF). In view of the inevitable advancement of scientific knowledge and technical expertise, revisions, updates and amendments must be initiated periodically. The guideline recommendations may not be applicable under all circumstances: it rests with the clinician to decide whether a given recommendation should be adopted, taking into consideration the unique set of clinical facts presented by each individual patient as well as the available resources.