9 results for Instrumental reason

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

60.00%

Publisher:

Abstract:

Hans-Georg Gadamer's philosophical hermeneutics – undoubtedly one of the cornerstones of twentieth-century thought – is a highly composite, multifaceted and articulated philosophy, formed, so to speak, by a multiplicity of different dimensions interwoven with one another. This is already evident from a simple glance at the internal composition of his main work, Wahrheit und Methode (1960), which presents a theory of understanding that examines three different dimensions of human experience – art, history and language – obviously conceived as fundamentally interrelated. But this overall picture becomes considerably more complicated as soon as one examines at least some of the numerous contributions Gadamer wrote and published before and after his opus magnum: contributions that testify to the significant presence of other themes in his thought. Gadamer's interpreters, however, have not always fully taken this complexity into account, since a large part of the exegetical literature on his thought is essentially centred on the 1960 masterpiece (and in particular on the problems of the legitimation of the Geisteswissenschaften), devoting less attention to the other paths he followed and, in particular, to the properly ethical and political dimension of his hermeneutic philosophy. Moreover, it seems to me that due attention has not always been paid to the fundamental unity – not to be confused with an alleged "systematicity", which Gadamer explicitly rejected – that nevertheless holds within Gadamerian thought despite its undoubted multiplicity and heterogeneity.
My thesis, then, is that aesthetics and the human sciences, philosophy of language and moral philosophy, dialogue with the Greeks and critical engagement with modern thought, considerations on anthropological problems and reflections on our sociopolitical and technoscientific present all represent the different dimensions of a single body of thought, dimensions which somehow converge towards a single centre. A "unifying" centre which, in my view, is to be found in what we might call the malaise of modernity. In other words, it seems to me that all of Gadamer's philosophical reflection ultimately springs from the acknowledgement of a situation of crisis or malaise in which our world and our civilization find themselves today. A crisis which, given its depth and complexity, has "branched out", so to speak, in multiple directions, affecting various spheres of human existence. These spheres are therefore analysed and investigated by Gadamer with a critical eye, in an effort to bring out the main problematic knots and, in their light, to advance alternative proposals, remedies, "correctives" and possible solutions. On the basis of this underlying understanding, my research is organized into three large sections, devoted respectively to the pars destruens of Gadamerian hermeneutics (first and second sections) and to its pars construens (third section). In the first section – entitled A Phenomenology of Modernity: The Manifold Symptoms of the Crisis – after showing how much of twentieth-century philosophy was dominated by the idea of a crisis currently afflicting Western civilization, and how Gadamer's hermeneutics can also be placed within this underlying philosophical discourse, I illustrate one by one what, in the eyes of the philosopher of Truth and Method, are the main symptoms of the present crisis.
These symptoms include: the socioeconomic pathologies of our "administered" and bureaucratized world; the indiscriminate planetary expansion of the Western way of life at the expense of other cultures; the crisis of values and certainties, with the concomitant spread of relativism, scepticism and nihilism; the growing inability to relate adequately and meaningfully to art, poetry and culture, increasingly degraded to mere entertainment; and, finally, the problems connected with the spread of weapons of mass destruction, the concrete possibility of an ecological catastrophe and the disquieting prospects opened up by certain recent scientific discoveries (above all in the field of genetics). Having outlined the general portrait Gadamer provides of our age, in the second section – entitled A Diagnosis of the Malaise of Modernity: The Spread of Technical-Scientific Instrumental Rationality – I try to show how, at the root of all these phenomena, he ultimately discerns a single source, which in his judgement coincides with the very origin of modernity: namely, the birth of modern science and its intrinsic bond with technology and with a specific form of rationality that Gadamer – evidently drawing on interpretative categories elaborated by Max Weber, Martin Heidegger and the Frankfurt School – also calls "instrumental rationality" or "calculative thinking".
Starting from this underlying vision, I then try to provide an analysis of Gadamer's conception of technoscience, highlighting at the same time several points: first, that Gadamer's philosophical hermeneutics should not be interpreted as a one-sidedly anti-scientific philosophy, but rather as an anti-scientistic one (which is of course quite a different thing); second, that his reconstruction of the crisis of modernity never issues in a "totalizing" critique of reason, nor in a pessimistic-negative philosophy of history centred on the idea of an ineluctable course of events guided by an "irrational" rationality contaminated by the lust for power and domination; third and finally, that Gadamer's philosophy – despite the inveterate interpretations that habitually see in it a traditionalist, authoritarian and radically anti-Enlightenment way of thinking – does not at all intend to reject the modern scientific Enlightenment tout court, nor to repudiate its most important achievements, but more simply to "correct" some of its tendencies and to recover a broader, more comprehensive notion of reason, one capable of accounting also for those aspects of human experience which, in the eyes of a "limited" rationality such as the scientistic one, cannot but appear as mere residues of irrationality. Having thus examined in the first two sections what we may call the pars destruens of Gadamer's philosophy, in the third and final section – entitled A Therapy for the Crisis of Modernity: The Rediscovery of Experience and Practical Knowledge – I turn to its pars construens, which in my judgement consists in a critical recovery of what he calls "another kind of knowledge": that is, in an attempt to rehabilitate all those pre- and extra-scientific forms of knowledge and experience that Gadamer regards as constitutive of the "hermeneutic dimension" of human existence.
My analysis of Gadamer's conception of Verstehen and Erfahrung – as forms of a "practical knowledge (praktisches Wissen)" different in principle from theoretical and technical knowledge – thus leads to an overall interpretation of philosophical hermeneutics as a genuine practical philosophy: that is, as an effort of philosophical clarification of that pre-scientific, intersubjective, "common sense" knowledge actually operative in the sphere of our Lebenswelt and of our practical existence. This, finally, also inevitably leads to an emphasis on the ethical-political implications of Gadamer's hermeneutics. In particular, I examine Gadamer's conception of ethics – taking into account its relations to the moral doctrines of Plato, Aristotle, Kant and Hegel – and finally outline a profile of his philosophical hermeneutics as a philosophy of dialogue, solidarity and freedom.

Relevance:

20.00%

Publisher:

Abstract:

Effects of the conflict between reason and passion in Bernard Mandeville's moral, economic and political thought. My PhD dissertation focuses on Bernard Mandeville (1670-1732), a Dutch philosopher who moved to London in his late twenties. The aspect of Mandeville's thought I take into account in my research is the conflicting relation between reason and the passions, and the consequences that Mandeville's view of this conflict has for the development of his theory of human nature, which, I argue, is what grounds his moral, economic and, above all, political theory. According to Mandeville, reason is fundamentally weak. The passions influence human actions more strongly and are, ultimately, what motivates them. The role of reason is merely instrumental, restricted to finding appropriate means to reach the desired ends, which are capricious and inconstant, since they all derive from unstable passions. Reason cannot take decisions meant to act over the long term, pursuing an object that is not selfish and temporary in nature. There is thus no possibility that men's actions aim simply at achieving a good and just society without their interests being directly involved. The basically selfish root of every desire leads Mandeville to claim that neither benevolence nor altruism guides human behaviour. Hence he passes judgement on the moral character of human beings, always busy with their own satisfaction and hardly ever considering what would be good from a wider perspective, including other people's sake. The anthropological features Mandeville ascribes to men are what lead him to prefer a political system in which governors are not supposed to have particular abilities, whether intellectual or moral, and in which peace and order are preserved by the bureaucratic machine, which is meant to work with the least effort on the part of politicians, so that no great harm can be done even by corrupt or wicked governors.
This system is adopted with an eye to remedying human deficiencies: when he thinks of how to build a peaceful and functioning society, Mandeville takes into primary account that everyone is concerned with his own selfish interest, and that the rationality of a single politician, or of a group of politicians belonging to the same generation, cannot find a good "solution" for governing men that is able to last over the long run and to work in different ages. This implies a rejection of the Hobbesian theory of the pactum subjectionis, which has the character of a rational and definitive choice, and leads Mandeville to consider instead the order that arises spontaneously, without any plan or rational intervention.

Relevance:

20.00%

Publisher:

Abstract:

This PhD thesis describes the application of some instrumental analytical techniques suitable for the study of food products fundamental to the human diet, such as extra virgin olive oil and dairy products. These products, widespread in the market and of high nutritional value, are increasingly credited with healthy properties, although their lipid fraction may contain some components unfavorable to human health. The research activity was structured around the following investigations: "Comparison of different techniques for trans fatty acid analysis"; "Fatty acid analysis of outcrop milk cream samples, with particular emphasis on the content of Conjugated Linoleic Acid (CLA) and trans Fatty Acids (TFA), using a 100 m high-polarity capillary column"; "Evaluation of the oxidized fatty acid (OFA) content during Parmigiano-Reggiano cheese ripening"; "Direct analysis of 4-desmethyl sterols and two dihydroxy triterpenes in saponified vegetable oils (olive oil and others) using liquid chromatography-mass spectrometry"; "Quantitation of long-chain polyunsaturated fatty acids (LC-PUFA) in base infant formulas by gas chromatography, and evaluation of the accuracy of the blending phases during their preparation"; "Fatty acid composition of Parmigiano-Reggiano cheese samples, with emphasis on trans isomers (TFA)".

Relevance:

20.00%

Publisher:

Abstract:

The treatment of Cerebral Palsy (CP) is considered the "core problem" of the whole field of pediatric rehabilitation. The primary role of this pathology can be ascribed to two main aspects. First of all, CP is the most frequent form of disability in childhood (one new case per 500 live births (1)); secondly, the functional recovery of the "spastic" child is, historically, the clinical field in which the majority of therapeutic methods and techniques (physiotherapeutic, orthotic, pharmacological, orthopedic-surgical, neurosurgical) were first applied and tested. The currently accepted definition of CP – a group of disorders of the development of movement and posture causing activity limitation (2) – is the result of a recent update by the World Health Organization to the language of the International Classification of Functioning, Disability and Health, from Ingram's original proposal of 1955 – a persistent but not unchangeable disorder of posture and movement (3). This definition considers CP a permanent ailment, i.e. a "fixed" condition, which can nevertheless be modified both functionally and structurally through the child's spontaneous evolution and the treatments carried out during childhood. The lesion that causes the palsy occurs in a structurally immature brain in the pre-, peri- or post-natal period (but only during the first months of life). The most frequent causes of CP are prematurity, insufficient cerebral perfusion, arterial haemorrhage, venous infarction, hypoxia of various origins (for example from the ingestion of amniotic fluid), malnutrition, infection and maternal or fetal poisoning; traumas and malformations must also be included. The lesion, whether focal or spread over the nervous system, impairs the functioning of the Central Nervous System (CNS) as a whole.
As a consequence, it affects the construction of the adaptive functions (4), first of all postural control, locomotion and manipulation. The palsy itself does not vary over time; however, it assumes an unavoidably "evolutionary" character when, during growth, the child is required to meet new and different needs through the construction of new and different functions. It is essential to consider that, clinically, CP is not only a direct expression of structural impairment, that is, of etiology, pathogenesis and lesion timing, but is mainly the manifestation of the path followed by the CNS to "re"-construct the adaptive functions "despite" the presence of the damage. "Palsy" is "the form of the function that is implemented by an individual whose CNS has been damaged in order to satisfy the demands coming from the environment" (4). Therefore it is only possible to establish general relations between lesion site, nature and size on the one hand, and palsy and recovery processes on the other. It is quite common to observe that children with very similar neuroimaging can show very different clinical manifestations of CP and, on the other hand, that children with very similar motor behaviours can have completely different lesion histories. A very clear example is represented by the hemiplegic forms, which show bilateral hemispheric lesions in a high percentage of cases. The first section of this thesis is aimed at guiding the interpretation of CP. First, the issue of the detection of the palsy is treated from a historical viewpoint. Then, an extended analysis of the current, internationally accepted definition of CP is provided. The definition is outlined first in terms of a space dimension and then of a time dimension, and it is highlighted where this definition is unacceptably lacking.
The last part of the first section further stresses the importance of shifting from the traditional concept of CP as a palsy of development (defect analysis) towards the notion of the development of the palsy, i.e., as the product of the relationship that the individual nevertheless tries to build dynamically with the surrounding environment (resource semeiotics), starting and growing from a different availability of resources, needs, dreams, rights and duties (4). In the scientific and clinical community, no common classification system of CP has so far been universally accepted. Moreover, no standard operative method or technique has been acknowledged to effectively assess the different disabilities and impairments exhibited by children with CP. CP is still "an artificial concept, comprising several causes and clinical syndromes that have been grouped together for convenience of management" (5). The lack of standard, common protocols able to effectively diagnose the palsy and, as a consequence, to establish specific treatments and prognoses is mainly due to the difficulty of elevating this field to a level based on scientific evidence. A solution aimed at overcoming the currently incomplete treatment of children with CP is the systematic clinical adoption of objective tools able to measure motor defects and movement impairments. The widespread application of reliable instruments and techniques able to objectively evaluate both the form of the palsy (diagnosis) and the efficacy of the treatments provided (prognosis) constitutes a valuable means of validating care protocols, establishing the efficacy of classification systems and assessing the validity of definitions. Since the 1980s, instruments specifically oriented to the analysis of human movement have been advantageously designed and applied in the context of CP with the aim of measuring motor deficits and, especially, gait deviations.
The gait analysis (GA) technique has been increasingly used over the years to assess, analyse and classify gait and to support clinical decision-making, allowing for a complete investigation of gait with increased temporal and spatial resolution. GA has provided a basis for improving the outcome of surgical and non-surgical treatments and for introducing a new modus operandi in the identification of defects and functional adaptations to musculoskeletal disorders. Historically, the first laboratories set up for gait analysis developed their own protocols (sets of procedures for data collection and data reduction) independently, according to the performance of the technologies available at the time. In particular, stereophotogrammetric systems, mainly based on optoelectronic technology, soon became a gold standard for motion analysis and have been successfully applied especially for scientific purposes. Nowadays optoelectronic systems have significantly improved their performance in terms of spatial and temporal resolution; however, many laboratories continue to use protocols designed around the technology available in the 1970s, now out of date. Furthermore, these protocols are not coherent either in their biomechanical models or in their collection procedures. In spite of these differences, GA data are shared, exchanged and interpreted irrespective of the protocol adopted, without full awareness of the extent to which these protocols are compatible and comparable with each other. Following the extraordinary advances in computer science and electronics, new systems for GA no longer based on optoelectronic technology are now becoming available: the Inertial and Magnetic Measurement Systems (IMMSs), based on miniature MEMS (microelectromechanical systems) inertial sensor technology.
These systems are cost-effective, wearable and fully portable motion analysis systems; these features give IMMSs the potential to be used outside specialized laboratories and to collect consecutive series of tens of gait cycles. The recognition and selection of the most representative gait cycle is then easier and more reliable, especially in children with CP, considering their relevant gait cycle variability. The second section of this thesis focuses on GA. In particular, it first examines the differences among the five most representative GA protocols, in order to assess the state of the art with respect to inter-protocol variability. The design of a new protocol is then proposed and presented, with the aim of performing gait analysis on children with CP by means of an IMMS. The protocol, named "Outwalk", contains original and innovative solutions oriented towards obtaining joint kinematics with calibration procedures that are extremely comfortable for the patients. The results of a first in-vivo validation of Outwalk on healthy subjects are then provided. In particular, this study was carried out by comparing Outwalk, used in combination with an IMMS, against a reference protocol and an optoelectronic system. In order to obtain a more accurate and precise comparison of the systems and protocols, ad hoc methods were designed, and an original formulation of the statistical parameter known as the coefficient of multiple correlation was developed and effectively applied. On the basis of the experimental design proposed for the validation on healthy subjects, a first assessment of Outwalk, together with an IMMS, was also carried out on children with CP. The third section of this thesis is dedicated to the treatment of walking in children with CP. Commonly prescribed treatments addressing gait abnormalities in children with CP include physical therapy, surgery (orthopedic and rhizotomy) and orthoses.
The orthotic approach is conservative, being reversible, and is widespread in many therapeutic regimes. Orthoses are used to improve the gait of children with CP by preventing deformities, controlling joint position and offering an effective lever for the ankle joint. Orthoses are prescribed with the additional aims of increasing walking speed, improving stability, preventing stumbling and decreasing muscular fatigue. Ankle-foot orthoses (AFOs) with a rigid ankle are primarily designed to prevent equinus and other foot deformities, with a positive effect also on more proximal joints. However, AFOs prevent the natural excursion of the tibio-tarsal joint during the second rocker, hence hampering the natural forward progression of the whole body under the effect of inertia (6). A new modular (submalleolar) astragalus-calcaneal orthosis, named OMAC, has recently been proposed with the intention of replacing the prescription of AFOs in those children with CP who exhibit a flat, valgus-pronated foot. The aim of this section is thus to present the mechanical and technical features of the OMAC by means of an accurate description of the device; in particular, the full text of the deposited Italian patent is provided. A preliminary validation of the OMAC with respect to the AFO is also reported, as resulted from an experimental campaign on diplegic children with CP over a three-month period, aimed at quantitatively assessing the benefit provided by the two orthoses on walking and at qualitatively evaluating the changes in quality of life and motor abilities. As already stated, CP is universally considered a persistent but not unchangeable disorder of posture and movement.
Contrary to this definition, some clinicians (4) have recently pointed out that movement disorders may be primarily caused by the presence of perceptive disorders, where perception is not merely the acquisition of sensory information but an active process aimed at guiding the execution of movements through the integration of sensory information properly representing the state of one's body and of the environment. Children with perceptive impairments show an overall fear of moving and the onset of strongly unnatural walking schemes directly caused by the presence of perceptive system disorders. The fourth section of the thesis thus deals with accurately defining the perceptive impairment exhibited by diplegic children with CP. A detailed description of the clinical signs revealing the presence of the perceptive impairment, and a classification scheme of the clinical aspects of perceptual disorders, is provided. In the end, a functional reaching test is proposed as an instrumental test able to disclose the perceptive impairment. References 1. Prevalence and characteristics of children with cerebral palsy in Europe. Dev Med Child Neurol. 2002 Sep;44(9):633-640. 2. Bax M, Goldstein M, Rosenbaum P, Leviton A, Paneth N, Dan B, et al. Proposed definition and classification of cerebral palsy, April 2005. Dev Med Child Neurol. 2005 Aug;47(8):571-576. 3. Ingram TT. A study of cerebral palsy in the childhood population of Edinburgh. Arch Dis Child. 1955 Apr;30(150):85-98. 4. Ferrari A, Cioni G. The spastic forms of cerebral palsy: a guide to the assessment of adaptive functions. Milan: Springer; 2009. 5. Olney SJ, Wright MJ. Cerebral palsy. In: Campbell S et al. Physical Therapy for Children. 2nd ed. Philadelphia: Saunders; 2000. p. 533-570. 6. Desloovere K, Molenaers G, Van Gestel L, Huenaerts C, Van Campenhout A, Callewaert B, et al. How can push-off be preserved during use of an ankle foot orthosis in children with hemiplegia? A prospective controlled study. Gait Posture. 2006 Oct;24(2):142-151.
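The abstract above mentions an original formulation of the statistical parameter known as the coefficient of multiple correlation (CMC), used to compare gait waveforms across protocols and trials. As a rough illustration of the standard CMC that such formulations start from (a sketch under my own assumptions: the function name, the pure-Python representation of cycles as lists, and the exact normalization are mine, not the thesis's formulation):

```python
import math

def cmc(cycles):
    """Coefficient of multiple correlation for a set of gait-cycle
    waveforms, each resampled to the same number of frames.

    cycles: list of F waveforms, each a list of T joint-angle samples.
    Returns a value close to 1 when the waveforms are highly similar.
    """
    F = len(cycles)
    T = len(cycles[0])
    # mean waveform across cycles, frame by frame, and its grand mean
    frame_mean = [sum(c[t] for c in cycles) / F for t in range(T)]
    grand_mean = sum(frame_mean) / T
    # between-cycle dispersion about the mean waveform
    num = sum((c[t] - frame_mean[t]) ** 2
              for c in cycles for t in range(T)) / (F * (T - 1))
    # total dispersion about the grand mean
    den = sum((c[t] - grand_mean) ** 2
              for c in cycles for t in range(T)) / (F * T - 1)
    return math.sqrt(max(0.0, 1.0 - num / den))
```

A CMC near 1 indicates that two protocols (or repeated trials) produce nearly superimposable joint-angle curves; the value drops as the waveforms diverge in shape or offset.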

Relevance:

20.00%

Publisher:

Abstract:

Consumer demand for natural, minimally processed, fresh-like and functional food has led to increasing interest in emerging technologies. The aim of this PhD project was to study three innovative food processing technologies currently used in the food sector: ultrasound-assisted freezing, vacuum impregnation and pulsed electric fields were investigated through laboratory-scale systems and semi-industrial pilot plants. Furthermore, analytical and sensory techniques were developed to evaluate the quality of foods and vegetable matrices obtained by traditional and emerging processes. Ultrasound proved a valuable technique for improving the freezing process of potatoes, anticipating the onset of nucleation, mainly when applied during the supercooling phase. A study of the effects of pulsed electric fields on the phenolic and enzymatic profile of melon juice was carried out, and the statistical treatment of the data was performed using a response surface method. Next, flavour enrichment of apple sticks was performed by applying different techniques (atmospheric, vacuum and ultrasound technologies, and their combinations). The second section of the thesis deals with the development of analytical methods for the discrimination and quantification of phenolic compounds in vegetable matrices, such as chestnut bark extracts and olive mill waste water. The management of waste disposal in the milling sector was approached with the aim of reducing the amount of waste while recovering valuable by-products to be used in different industrial sectors. Finally, the sensory analysis of boiled potatoes was carried out through the development of a quantitative descriptive procedure for the study of Italian and Mexican potato varieties.
An update on flavour development in fresh and cooked potatoes was compiled, and a sensory glossary, including general and specific definitions related to organic products and used in the European project Ecropolis, was drafted.
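The melon-juice study above treats its data with a response surface method. The core of such a method is fitting a second-order polynomial model of the response as a function of the process factors; here is a minimal, self-contained sketch of that fitting step by ordinary least squares. The function name, the two-factor setup and the plain-Python solver are my own illustrative choices, not the procedure actually used in the thesis.

```python
def fit_quadratic_surface(x1, x2, y):
    """Ordinary-least-squares fit of a full second-order response surface
    y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2.

    x1, x2: factor settings; y: measured responses.
    Returns the six coefficients [b0, ..., b5].
    """
    # design matrix of the second-order model
    X = [[1.0, a, b, a * a, b * b, a * b] for a, b in zip(x1, x2)]
    n = 6
    # normal equations: (X^T X) beta = X^T y
    A = [[sum(row[i] * row[j] for row in X) for j in range(n)] for i in range(n)]
    v = [sum(row[i] * yk for row, yk in zip(X, y)) for i in range(n)]
    # Gaussian elimination with partial pivoting
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        v[i], v[p] = v[p], v[i]
        for r in range(i + 1, n):
            f = A[r][i] / A[i][i]
            for c in range(i, n):
                A[r][c] -= f * A[i][c]
            v[r] -= f * v[i]
    # back substitution
    beta = [0.0] * n
    for i in range(n - 1, -1, -1):
        beta[i] = (v[i] - sum(A[i][c] * beta[c] for c in range(i + 1, n))) / A[i][i]
    return beta
```

Once fitted, the surface can be inspected for optima, for instance the combination of field strength and treatment time that best preserves the phenolic profile.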

Relevance:

20.00%

Publisher:

Abstract:

Food suppliers currently measure apple quality using basic pomological descriptors. Sensory analysis is expensive, does not permit the analysis of many samples, and cannot be implemented for measuring quality properties in real time. However, sensory analysis is the best way to describe food eating quality precisely, since it can define, measure and explain what is really perceivable by the human senses, using a language that closely reflects consumers' perception. On the basis of these observations, we developed a detailed protocol for apple sensory profiling by descriptive sensory analysis and instrumental measurements. The collected sensory data were validated by applying rigorous scientific criteria for sensory analysis. The method was then applied to study the sensory properties of apples and their changes in relation to different pre- and post-harvest factors affecting fruit quality, and proved able to discriminate fruit varieties and to highlight differences in sensory properties; the instrumental measurements confirmed these results. Moreover, the correlation between sensory and instrumental data was studied, and a new effective approach was defined for the reliable prediction of sensory properties from instrumental characterisation. It is therefore possible to propose the application of this sensory-instrumental tool to all the stakeholders involved in apple production and marketing, so as to obtain a reliable description of apple fruit quality.

Relevance:

20.00%

Publisher:

Abstract:

The effectiveness of interim judicial protection, understood as timely, adequate and full protection, has been the guiding thread of the evolution of Italian administrative justice, which, over a period of more than a century, thanks to the work of case law and legal scholarship, has today taken the structure of a genuine trial. A recent landmark of this evolution, and at the same time its symbol, is certainly the Code of administrative procedure issued with Legislative Decree no. 104 of 2 July 2010. The research reported in this contribution began when the new text entered into force, and was therefore also an opportunity to observe its first applications in case law. In particular, the Code was read, setting aside a mere survey of its long series of articles, in the light of a present-day weighing not only of the principle of effectiveness but also of the principle of instrumentality that traditionally links the interim phase to the merits phase. The results of the research show the legislator's intention to confirm this instrumental relationship, in order to counter an uncontrolled drift towards interim measures with sometimes irreversible effects, as had occurred in judicial practice, while at the same time revealing the intention to extend the scope of interim protection. Looking at what interim protection has become today, one observes a strengthening of the conformative effects, typical of judgments on the merits, which have extended to the interim phase. The courts, although aware that interim protection is not a response based on full cognition but on summary cognition, nevertheless intend to guarantee timely and effective protection, including through particular procedural techniques, such as the remand, to which ample space is devoted within the research.
In its final part, the research focused, again with a view to a global consideration of the effects of interim protection, on the moment of enforcement and therefore on the compliance (ottemperanza) proceedings.

Relevance:

20.00%

Publisher:

Abstract:

In patients with a univentricular heart, total cavopulmonary connection (TCPC) surgery, because of the particular haemodynamic condition it creates, affects numerous organ parenchymas. The aim of this research was to evaluate the extent of this damage at medium-to-long-term follow-up. A total of 115 patients were enrolled, operated on at the Pediatric Cardiac Surgery centres of Bologna (52 patients) and Turin (63 patients). Mean follow-up was 125±2 months. Patients underwent haemodynamic investigation (88 patients), cardiopulmonary exercise testing (75 patients), and Fibroscan with hepatic ultrasound (47 patients). Mean pulmonary pressure was 11.5±2.6 mmHg, and in 12 patients pulmonary pressure exceeded 15 mmHg. Mean atrial pressure was 6.7±2.3 mmHg, and indexed pulmonary vascular resistance (PVR) averaged 2±0.99 WU/m2; in 29 patients PVR exceeded 2 WU/m2. Mean peak VO2 was 28±31 ml/kg/min, 58±15% of the predicted value. Maximum heart rate at peak exercise was 151±22 bpm, 74±17% of the predicted value. Fibroscan yielded a mean value of 17.01 kPa (8-34.3 kPa). Five patients were in class F2, 9 patients in class F3 and 33 patients in class F4. In patients with follow-up longer than 10 years, hepatic stiffness (19.6±5.2 kPa) was significantly higher than in patients with follow-up shorter than 10 years (15.1±5.8 kPa, p<0.01). The maximum heart rate reached during cardiopulmonary exercise testing was significantly correlated with the morphology of the single ventricle, being 67.8±14.4% of the predicted value in patients with a right ventricle versus 79.6±8.7% in those with a left ventricle (p=0.006).
TCPC surgery thus causes an involvement of numerous organ parenchymas proportional to the length of follow-up, and therefore requires constant multidisciplinary clinical and instrumental monitoring.
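For context on the indexed pulmonary vascular resistance reported above (in Wood units per m2), the conventional clinical formula divides the transpulmonary pressure gradient by the indexed pulmonary flow. A minimal sketch (the function name is mine, and the example flow value is an inference chosen purely for illustration, not a figure reported in the abstract):

```python
def pvr_index(mean_pa_pressure_mmhg, mean_atrial_pressure_mmhg, qp_index_l_min_m2):
    """Indexed pulmonary vascular resistance in Wood units per m2:
    transpulmonary pressure gradient divided by indexed pulmonary flow.
    In a cavopulmonary circulation the downstream pressure is taken as
    the common atrial pressure.
    """
    gradient = mean_pa_pressure_mmhg - mean_atrial_pressure_mmhg
    return gradient / qp_index_l_min_m2
```

With the study's mean pressures (11.5 and 6.7 mmHg), the reported mean PVR of about 2 WU/m2 would correspond to an indexed flow of about 2.4 l/min/m2, a value inferred here only to make the arithmetic concrete.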

Relevance:

20.00%

Publisher:

Abstract:

An extensive study of the morphology and dynamics of the equatorial ionosphere over South America is presented here. A multi-parametric approach is used to describe the physical characteristics of the ionosphere in the regions where the combination of the thermospheric electric field and the horizontal geomagnetic field creates the so-called Equatorial Ionization Anomalies. Ground-based measurements from GNSS receivers are used to link the Total Electron Content (TEC) and its spatial gradients to the phenomenon known as scintillation, which can lead to GNSS signal degradation or even to a GNSS signal "loss of lock". A new algorithm to highlight the features characterizing the TEC distribution is developed in the framework of this thesis, and the results obtained are validated and used to improve the performance of a GNSS positioning technique (long-baseline RTK). In addition, the correlation between scintillation and the dynamics of ionospheric irregularities is investigated. By means of software implemented here, the velocity of the ionospheric irregularities is evaluated using high-sampling-rate GNSS measurements. The results highlight the parallel behaviour of the amplitude scintillation index (S4) occurrence and the zonal velocity of the ionospheric irregularities, at least under severe scintillation conditions (post-sunset hours). This suggests that scintillations are driven by TEC gradients as well as by the dynamics of the ionospheric plasma. Finally, given the importance of such studies for technological applications (e.g. GNSS high-precision applications), a validation of the NeQuick model (the model used in the new GALILEO satellites for TEC modelling) is performed. The NeQuick performance improves dramatically when data from HF radar sounding (ionograms) are ingested. A custom-designed algorithm, based on image recognition techniques, is developed to properly select the ingested data, leading to a further improvement of the NeQuick performance.
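For context on the amplitude scintillation index S4 mentioned above: S4 is conventionally defined as the normalized standard deviation of the received signal intensity over a short measurement window. A minimal sketch of that textbook definition (the function name and the handling of the window are my own illustrative choices, not the thesis's implementation):

```python
def s4_index(intensity):
    """Amplitude scintillation index S4: normalized standard deviation of
    received signal intensity over a measurement window (typically tens of
    seconds of detrended samples).
    """
    n = len(intensity)
    mean_i = sum(intensity) / n
    mean_i2 = sum(x * x for x in intensity) / n
    # guard against tiny negative values from floating-point rounding
    return max(0.0, (mean_i2 - mean_i ** 2) / mean_i ** 2) ** 0.5
```

A steady signal gives S4 near 0, while deep intensity fading pushes it towards 1; values in the upper part of that range mark the severe scintillation conditions under which loss of lock becomes likely.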