117 results for TDP, Travelling Deliveryman Problem, optimization algorithms


Relevance:

30.00%

Publisher:

Abstract:

This research seeks to frame from new perspectives a by-now classic problem within the semiotics of advertising: the analysis of the television commercial (the “spot”). The key points of the work, and its claim to a certain difference from studies with similar objects, consist essentially of three aspects. First, there is a return to the Flochian origins of the field, insofar as not only the overall context and the aims of the research are firmly anchored in marketing objectives, but the study in its entirety arises from a concrete dialogue between the methodology of semiotic analysis and actual practice within market research institutes. The thesis therefore does not present a collection of self-referential analyses of advertising texts, but rather a series of “test runs” of the methodology, functional to the definition of research designs for marketing research. This entails a rather close dialogue with related methodologies (qualitative and quantitative sociology, motivational psychology, etc.), in the conviction that the priority granted to the object of analysis outranks orthodoxy about methodological tools. Ultimately, the commercial is always analysed within a brand-centric perspective that keeps firmly in mind the semiotics of the consumption situation, with respect to which the commercial acts as a lever of valorisation for purchase. Second, the objects analysed are quite varied: not only the commercial in its final audiovisual version (the footage), but also storyboards, animatics, and concepts (of product and of communication). The Greimasian generative perspective is thus grafted onto issues linked (also) to the genesis of the commercial, to its design and to its redesign/optimisation. The thesis shows how a semiotics for marketing consultancy approaches its object by asking well-circumscribed questions aimed at a specific goal, essentially derived from the brief containing the communication intentions of the client company. Finally, while remaining within a generative semiotic theory (essentially Greimasian and post-Greimasian), the research adopts an intrinsically multidisciplinary perspective which, on the one hand, looks at issues related to marketing, branding, and advertising and corporate communication tout court and, on the other, returns to theories of the audiovisual, showing affinities with and differences from standard audiovisual forms (the “film”) and borrowings from new aesthetics (neo-television, the music video, etc.). The thesis firmly maintains that, in order to speak of discursive effectiveness, it is indispensable to examine expressive syncretism and the specific modes of stylistic manifestation. The work is organised into four main thematic areas. After a brief introduction on the topicality of the “spot” and on the analytical-methodological perspective adopted (§ 1), the second chapter assumes theoretically that the contents of the commercial derive from a specific (and each time different) creolisation of thematic domains coming from the brand, from the product (understood both as a product concept and as a product already “dressed” in its packaging) and from sociocultural trends.
These three dimensions are assessed against the opposition between heritage, i.e. continuity with one's own past and with competitors, and vision, i.e. discontinuity with one's own communication history and with that of competitors. Other factors are also explored, such as the testimonial-endorser who, being an element already intrinsically laden with valorisation, significantly affects the thematic and axiological complex of the advertisement. As the section of the thesis that deals specifically with the content plane of the commercial, this part also becomes an opportunity to return to Jean-Marie Floch's model of consumption axiologies, raising some criticisms while defending a model that, in the perspective presented here, retains points of relevance that cannot be ignored by later schematisations which are in some way indebted to it. There follows a section (§ 3) specifically devoted to the unfolding of audiovisual syncretism which, mirroring the previous one devoted to the forms and substances of content, concentrates on expressive dynamics. The commercial is analysed as a “textual form” endowed with certain specificities, first of all brevity. The issues related to the contribution of each specific substance are also explored: the relationship between the visual and the sonic, the screen and its increasingly evident multi-perspectivity, the “punctuating” work of music, and so on, and above all the dominant concept of editing, intrinsically linked to that of rhythm. The fourth chapter returns in depth to the relationship between semiotics and market research, analysing both their mutual knowledge (or lack thereof) and the new spaces for semiotic analysis. After arguing against a certain scepticism about the pragmatic usefulness of semiotic analysis, the study examines the traditional models for evaluating and measuring advertising effectiveness (pre- and post-tests), seeking to semiotise their content. From this follows the proposal of modular semiotic research designs, which can be combined with one another and configured within semio-qualitative-quantitative projects. After redefining the possibilities of a semiotic investigation of the parameters of discursive effectiveness, a concrete case is analysed (§ 5): given a commercial that proved effective in the eyes of the commissioning company, how can its success factors be replicated? And how, conversely, can one explain the failure of subsequent campaigns which, at least in theory, were designed to capitalise on the effectiveness of the first? This is therefore not a semiotics naively called upon to “measure” advertising effectiveness, which market research obviously analyses with established quantitative tools based on the recording of specific consumer parameters (spontaneous and prompted recall, the resulting brand image in the minds of users and prospective consumers, stimulated purchase intention). Rather, the approach presented here is concerned, more functionally, with explaining which expressive, discursive and narrative elements were responsible for the reception of the commercial (and thus, prospectively, may condition it positively or negatively in the future).
The analysis shows how apparently minimal elements, anchored at different levels of pertinence, can produce a considerable diversity of effects of meaning. It is therefore a problem of lacking coherence between communicative intentions and the advertising text actually produced. Resolving these pragmatic questions leads to theoretical-methodological developments on some particularly interesting fronts. First, the contribution of the passional dimension to the construction of effectiveness and to the involvement of the viewer/consumer is examined. Moreover, and here lies one of the points of greatest synthesis in the thesis, a fruitful discussion is undertaken of the models for typifying advertising genres, understood as discursive forms. Different but to some extent coextensive and overlapping models, such as those of Jean-Marie Floch, Guido Ferraro, Cosetta Saba and Chiara Giaccardi, are brought into dialogue. This leads to the construction of a new synthetic model, ideally all-pervasive and transversal to the perspectives analysed.

Relevance:

30.00%

Publisher:

Abstract:

Wheel-rail contact analysis plays a fundamental role in the multibody modeling of railway vehicles. A good contact model must provide an accurate description of the global contact phenomena (contact forces and torques, number and position of the contact points) and of the local contact phenomena (position and shape of the contact patch, stresses and displacements). The model must also ensure high numerical efficiency (so that it can be implemented directly online within multibody models) and good compatibility with commercial multibody software (Simpack Rail, Adams Rail). The wheel-rail contact problem has been discussed by several authors and many models can be found in the literature. Contact models can be subdivided into two categories: global models and local (or differential) models. Currently, as regards the global models, the main approaches to the problem are the so-called rigid contact formulation and the semi-elastic contact description. The rigid approach considers the wheel and the rail as rigid bodies. The contact is imposed by means of constraint equations and the contact points are detected during the dynamic simulation by solving the nonlinear algebraic-differential equations associated with the constrained multibody system. Indentation between the bodies is not permitted and the normal contact forces are calculated through the Lagrange multipliers. Finally, Hertz's and Kalker's theories are used to evaluate the shape of the contact patch and the tangential forces, respectively. The semi-elastic approach also considers the wheel and the rail as rigid bodies; in this case, however, no kinematic constraints are imposed and indentation between the bodies is permitted. The contact points are detected by means of approximate procedures (based on look-up tables and simplifying hypotheses on the problem geometry). The normal contact forces are calculated as a function of the indentation while, as in the rigid approach, Hertz's and Kalker's theories are used to evaluate the shape of the contact patch and the tangential forces. Both multibody approaches are computationally very efficient, but their generality and accuracy often turn out to be insufficient because the physical hypotheses behind these theories are too restrictive and, in many circumstances, unverified. In order to obtain a complete description of the contact phenomena, local (or differential) contact models are needed. In other words, wheel and rail have to be considered elastic bodies governed by the Navier equations, and the contact has to be described by suitable analytical contact conditions. The contact between elastic bodies has been widely studied in the literature, both in the general case and in the rolling case, and many procedures based on variational inequalities, FEM techniques and convex optimization have been developed. This kind of approach ensures high generality and accuracy, but still entails very high computational costs and memory consumption. Because of this computational load, at the current state of the art the integration between multibody and differential modeling is almost absent in the literature, especially in the railway field.
However, this integration is very important, because only differential modeling allows an accurate analysis of the contact problem (in terms of contact forces and torques, position and shape of the contact patch, stresses and displacements), while multibody modeling is the standard in the study of railway dynamics. This thesis describes some innovative wheel-rail contact models developed during the Ph.D. activity. Concerning the global models, two new models belonging to the semi-elastic approach are presented; they satisfy the following specifications: 1) the models have to be 3D and consider all six relative degrees of freedom between wheel and rail; 2) the models have to consider generic railway tracks and generic wheel and rail profiles; 3) the models have to ensure a general and accurate handling of multiple contact without simplifying hypotheses on the problem geometry, in particular evaluating the number and position of the contact points and, for each point, the contact forces and torques; 4) the models have to be implementable directly online within multibody models without look-up tables; 5) the models have to ensure computation times comparable with those of commercial multibody software (Simpack Rail, Adams Rail) and compatible with RT and HIL applications; 6) the models have to be compatible with commercial multibody software (Simpack Rail, Adams Rail). The most innovative aspect of the new global contact models concerns the detection of the contact points: both models aim to reduce the dimension of the algebraic problem by means of suitable analytical techniques. This reduction yields a high numerical efficiency that makes the online implementation of the new procedure possible and achieves performance comparable with that of commercial multibody software, while the analytical approach preserves high accuracy and generality. Concerning the local (or differential) contact models, one new model satisfying the following specifications is presented: 1) the model has to be 3D and consider all six relative degrees of freedom between wheel and rail; 2) the model has to consider generic railway tracks and generic wheel and rail profiles; 3) the model has to ensure a general and accurate handling of multiple contact without simplifying hypotheses on the problem geometry, in particular being able to calculate both the global contact variables (contact forces and torques) and the local contact variables (position and shape of the contact patch, stresses and displacements); 4) the model has to be implementable directly online within multibody models; 5) the model has to ensure high numerical efficiency and a reduced memory consumption in order to achieve a good integration between multibody and differential modeling (the basis for the local contact models); 6) the model has to be compatible with commercial multibody software (Simpack Rail, Adams Rail). In this case the most innovative aspects of the new local contact model concern the contact modeling (by means of suitable analytical conditions) and the implementation of the numerical algorithms needed to solve the discrete problem arising from the discretization of the original continuum problem.
Moreover, during the development of the local model, achieving a good compromise between accuracy and efficiency turned out to be essential for a good integration between multibody and differential modeling. The contact models have then been inserted within a 3D multibody model of a railway vehicle to obtain a complete model of the wagon. The railway vehicle chosen as benchmark is the Manchester Wagon, whose physical and geometrical characteristics are readily available in the literature. The model of the whole railway vehicle (multibody model and contact model) has been implemented in the Matlab/Simulink environment. The multibody model has been implemented in SimMechanics, a Matlab toolbox specifically designed for multibody dynamics, while the contact models use C S-functions, a Matlab architecture that allows the Matlab/Simulink and C/C++ environments to be connected efficiently. A 3D multibody model of the same vehicle (this time equipped with a standard contact model based on the semi-elastic approach) has then also been implemented in Simpack Rail, a commercial multibody software package for railway vehicles that is widely tested and validated. Finally, numerical simulations of the vehicle dynamics have been carried out on many different railway tracks with the aim of evaluating the performance of the whole model. The comparison between the results obtained by the Matlab/Simulink model and those obtained by the Simpack Rail model has allowed an accurate and reliable validation of the new contact models. In conclusion to this brief introduction to my Ph.D. thesis, we would like to thank Trenitalia and the Regione Toscana for the support provided throughout the Ph.D. activity. We would also like to thank INTEC GmbH, the company that develops the Simpack Rail software, with which we are currently collaborating to develop innovative toolboxes specifically designed for wheel-rail contact analysis.
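As a minimal numerical illustration of the semi-elastic idea described in the abstract above, the following Java sketch maps a wheel-rail indentation to a normal contact force using the classic sphere-on-plane Hertz law as a stand-in for the elliptical wheel-rail patch. The class, the helper functions and all numerical values are illustrative assumptions, not the thesis' actual formulation.

public final class HertzNormalContact {

    /** Effective elastic modulus E* for two bodies of the same material:
     *  1/E* = 2 * (1 - nu^2) / E. */
    static double effectiveModulus(double youngModulus, double poissonRatio) {
        return 1.0 / (2.0 * (1.0 - poissonRatio * poissonRatio) / youngModulus);
    }

    /** Normal force [N] for a given indentation [m], Hertz point contact:
     *  F = (4/3) * E* * sqrt(R) * delta^(3/2). */
    static double normalForce(double indentation, double effectiveRadius,
                              double effectiveModulus) {
        if (indentation <= 0.0) {
            return 0.0; // bodies not in contact: no force is applied
        }
        double stiffness = (4.0 / 3.0) * effectiveModulus * Math.sqrt(effectiveRadius);
        return stiffness * Math.pow(indentation, 1.5);
    }

    public static void main(String[] args) {
        double eStar = effectiveModulus(210e9, 0.28); // steel wheel on steel rail
        double radius = 0.46;                         // rough effective radius [m]
        double delta = 0.05e-3;                       // 0.05 mm indentation
        System.out.printf("Normal force: %.1f kN%n",
                normalForce(delta, radius, eStar) / 1000.0);
    }
}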

Relevance:

30.00%

Publisher:

Abstract:

Generic programming is likely to become a new challenge for a critical mass of developers. Therefore, it is crucial to refine the support for generic programming in mainstream Object-Oriented languages — both at the design and at the implementation level — as well as to suggest novel ways to exploit the additional degree of expressiveness made available by genericity. This study is meant to provide a contribution towards bringing Java genericity to a more mature stage with respect to mainstream programming practice, by increasing the effectiveness of its implementation and by revealing its full expressive power in a real-world scenario. With respect to the current research setting, the main contribution of the thesis is twofold. First, we propose a revised implementation of Java generics that greatly increases the expressiveness of the Java platform by adding reification support for generic types. Second, we show how Java genericity can be leveraged in a real-world case study in the context of multi-paradigm language integration. Several approaches have been proposed to overcome the lack of reification of generic types in the Java programming language. Existing approaches tackle the problem by defining new translation techniques that allow a runtime representation of generics and wildcards. Unfortunately, most approaches suffer from several problems: heterogeneous translations are known to be problematic when considering reification of generic methods and wildcards, while more sophisticated techniques requiring changes in the Java runtime support reified generics through a true language extension (where clauses), so that backward compatibility is compromised. In this thesis we develop a sophisticated type-passing technique for addressing the problem of reification of generic types in the Java programming language; this approach — first pioneered by the so-called EGO translator — is here turned into a full-blown solution which reifies generic types inside the Java Virtual Machine (JVM) itself, thus overcoming both the performance penalties and the compatibility issues of the original EGO translator. As for Java-Prolog integration, combining Object-Oriented and declarative programming has been the subject of several research efforts and corresponding technologies. Such proposals come in two flavours: either attempting to join the two paradigms, or simply providing an interface library for accessing Prolog declarative features from a mainstream Object-Oriented language such as Java. Both solutions, however, have drawbacks: hybrid languages featuring both Object-Oriented and logic traits are typically too complex, making mainstream application development a harder task, while library-based integration approaches provide no true language integration, and some “boilerplate code” has to be written to bridge the paradigm mismatch. In this thesis we develop a framework called PatJ which promotes seamless exploitation of Prolog programming in Java. A sophisticated usage of generics/wildcards makes it possible to define a precise mapping between Object-Oriented and declarative features. PatJ defines a hierarchy of classes where the bidirectional semantics of Prolog terms is modelled directly at the level of the Java generic type system.
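As a minimal illustration of the type-erasure limitations that motivate reified generics, the sketch below shows what erased generics cannot express and the usual Class-token workaround. The class and method names are illustrative assumptions; they are not part of the EGO-based translator or of the PatJ framework described above.

import java.util.ArrayList;
import java.util.List;

public final class ErasureDemo {

    // With erased generics the element type T is not available at run time,
    // so none of the following can be written directly:
    //     if (list instanceof List<String>) { ... }   // rejected by the compiler
    //     T fresh = new T();                           // rejected by the compiler
    //     T[] buffer = new T[10];                      // rejected by the compiler
    //
    // The usual erasure-era workaround threads a Class token through the code;
    // a JVM with reified generics would make this bookkeeping unnecessary.
    static <T> void addIfInstance(List<T> items, Class<T> type, Object candidate) {
        if (type.isInstance(candidate)) {   // run-time check via the explicit token
            items.add(type.cast(candidate));
        }
    }

    public static void main(String[] args) {
        List<String> terms = new ArrayList<>();
        addIfInstance(terms, String.class, "parent(zeus, ares)"); // added
        addIfInstance(terms, String.class, 42);                   // silently skipped
        System.out.println(terms); // prints [parent(zeus, ares)]
    }
}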

Relevance:

30.00%

Publisher:

Abstract:

Data from various studies carried out in Italy in recent years on the problem of school dropout in secondary education show that difficulty in studying mathematics is one of the most frequent sources of discomfort reported by students. Nevertheless, it is definitely unrealistic to think we can do without such knowledge in today's society: mathematics is widely taught in secondary school and is not confined to technical-scientific courses only. It is reasonable to say that, although students may choose academic paths that are apparently far away from mathematics, all of them will have to come to terms with this subject sooner or later in their lives. Among the reasons for the discomfort caused by the study of mathematics, some concern the very nature of the subject and in particular the complex symbolic language through which it is expressed. In fact, mathematics is a multimodal system composed of oral and written verbal texts, symbolic expressions such as formulae and equations, figures and graphs. For this reason, the study of mathematics represents a real challenge for those who suffer from dyslexia, a constitutional condition that limits people's performance in reading and writing and, in particular, in the study of mathematical contents. Here the difficulties in working with verbal and symbolic codes entail, in turn, difficulties in the comprehension of the texts from which to deduce the operations that, once combined together, lead to the final solution of a problem. Information technologies may effectively support this learning disorder. However, these tools have some implementation limits that restrict their use in the study of scientific subjects. Word processors with speech synthesis are currently used to compensate for reading difficulties within the area of classical studies, but they are not used within the area of mathematics. This is because the speech synthesis (or rather the screen reader supporting it) is not able to interpret anything that is not textual, such as symbols, images and graphs. The DISMATH software, which is the subject of this project, would allow dyslexic users to read technical-scientific documents with the help of speech synthesis, to understand the spatial structure of formulae and matrices, and to write documents with technical-scientific content in a format compatible with the main scientific editors. The system uses LaTeX, a text-based mathematical language, as its mediation system. It is set up as a LaTeX editor whose graphical interface, in line with the main commercial products, offers additional functions designed to support the needs of users who are not able to manage verbal and symbolic codes on their own. LaTeX is translated in real time into a standard symbolic language and read by the speech synthesis in natural language, in order to increase, through this bimodal representation, the ability to process information. The understanding of a mathematical formula through its reading is made possible by the deconstruction of the formula itself and by its “tree” representation, which allows the logical elements composing it to be identified. Even users who do not know the LaTeX language can write whatever scientific document they need: the symbolic elements are recalled from dedicated menus and automatically translated by the software, which manages the correct syntax.
The final aim of the project, therefore, is to implement an editor enabling dyslexic people (but not only them) to manage mathematical formulae effectively, through the integration of different software tools, thus also allowing better teacher/learner interaction.
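As a minimal sketch of the “tree” reading idea described above, the Java fragment below deconstructs a tiny LaTeX expression into nested parts and verbalises it for a speech synthesiser. The grammar handled here (only \frac and plain symbols) and the class name are illustrative assumptions, far smaller than what DISMATH would need.

public final class FormulaReader {

    /** Verbalises \frac{num}{den} recursively; anything else is read literally. */
    static String verbalize(String latex) {
        latex = latex.trim();
        if (latex.startsWith("\\frac")) {
            int open = latex.indexOf('{');        // start of the numerator
            int mid = matching(latex, open);      // '}' closing the numerator
            int end = matching(latex, mid + 1);   // '}' closing the denominator
            String num = latex.substring(open + 1, mid);
            String den = latex.substring(mid + 2, end);
            return "the fraction with numerator " + verbalize(num)
                 + " and denominator " + verbalize(den);
        }
        return latex.replace("+", " plus ").replace("-", " minus ");
    }

    /** Index of the '}' matching the '{' at position open (depth counting). */
    static int matching(String s, int open) {
        int depth = 0;
        for (int i = open; i < s.length(); i++) {
            if (s.charAt(i) == '{') depth++;
            if (s.charAt(i) == '}' && --depth == 0) return i;
        }
        throw new IllegalArgumentException("unbalanced braces");
    }

    public static void main(String[] args) {
        // Prints: the fraction with numerator a plus b and denominator c
        System.out.println(verbalize("\\frac{a+b}{c}"));
    }
}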

Relevance:

30.00%

Publisher:

Abstract:

This thesis concerns myoelectric prostheses, the most widespread type of prosthetic limb, which can be described as robotic limbs whose artificial segments are actuated by electromechanical joints powered by rechargeable batteries and activated by electromyographic signals (electrical signals generated by muscle contraction). The upper-limb prostheses currently available on the market may be inadequate for the satisfactory rehabilitation of some patients with high-level amputations who require a high degree of functionality in everyday life. In this context, the INAIL Prosthesis Centre of Budrio di Vigorso (Bologna) and the University of Bologna are developing new prosthetic limbs, with the long-term goal of making available a range of upper-limb prosthesis solutions able to satisfy most of the needs of amputees. The aim of this thesis is the introduction of a new active humeral rotator to be integrated into the upper-limb prosthesis available in our laboratories. To this end, a design procedure already consolidated in previous work for the development of a two-degree-of-freedom shoulder prosthesis was adopted. Different kinematic models were studied by means of kinematic analyses in order to determine the performance gain resulting from the introduction of the new active humeral rotator. Kinetostatic analyses were also carried out to define the reference technical specifications (in terms of the loads acting on the humeral rotator) and to guide the sizing of the power transmission chain of the new prosthetic device. Further technical specifications were considered in order to guarantee the spontaneous non-backdrivability of the motion under external loads (when the active joints of the prosthesis are not powered), to safeguard the patient in the event of a fall, to measure the angular position of the humeral rotator (so as to implement feedback control strategies), and to limit the power consumption and noise of the device. A feasibility study made it possible to select the optimal architecture of the power transmission chain for the new humeral rotator, the selection criteria being mainly the limitation of the weight and size of the new prosthetic device. Detailed design followed, after which a prototype of the new humeral rotator was built in our laboratories. Finally, the thesis describes a preliminary experimental activity that allowed considerations on the performance of the prototype and important observations for the subsequent revision and optimisation of the humeral rotator design.

Relevance:

30.00%

Publisher:

Abstract:

Rita Cannas presents a PhD thesis in Economics (Geo-Economics curriculum) titled “Public Policies for Seasonality in Tourism from a Territorial Perspective. Case Studies in Scotland and Sardinia”. The specific area of the research is public policies for counteracting seasonality in tourism in peripheral areas. Seasonality is seen as a problem in terms of social and economic patterns, especially for local communities situated in peripheral areas. The research explores what public policies have been in place in Scotland and Sardinia over the last 10-5 years, how and for whom they are working, and what kind of results they have produced. The research has empirical and theoretical implications for the study of tourism seasonality. It aims to highlight the local supply-side patterns of the phenomenon investigated, and to improve knowledge about the strategies and policies that have been adopted in the two territorial contexts (Scotland and Sardinia) for counteracting or modifying seasonality in tourism. The type of subject and the research questions suggested the adoption of an interpretative theoretical perspective and a qualitative methodological approach, although a set of quantitative secondary data is also required for understanding the main characteristics of tourism and for analysing the specificity of seasonality. Interviews with key actors of the local systems in Scotland and Sardinia are the method chosen to collect primary data; in total the researcher conducted 20 in-depth interviews. Case studies are chosen both as the unit of analysis and as the research strategy. The main findings of the research show a differentiated and complex scenario regarding the quality and quantity of public tourism policies and strategies in the two case studies. The role of local resources is quite strategic in delivering tourism services and in counteracting seasonality. Events and festivals are the main demand-side strategies, while on the supply side the principal policies focus on service quality, technology, high skills and sustainability. Partnership between the public and private sectors appears to be a fundamental way of working in order to attain changes and outcomes. The research has a strong research design, provides coherent results, and was carried out with attention to the validation of the whole process.

Relevance:

30.00%

Publisher:

Abstract:

In today's healthcare organisations it is not uncommon to find health workers who believe that the relationship with the patient consists in having adequate technical competences pertaining to the profession and in applying them with care and diligence. On the other hand, the “human factor” tends to be invoked, and yet there are complaints that practitioners relate to patients in an aseptic and depersonalised way. From a scientific point of view, in psychology the term “relationship” refers essentially to the implicit and almost always unconscious meanings conveyed by any relationship: it therefore depends on the psychic structure of the two interlocutors, involving in particular the sphere of affectivity, and proceeds through communicative processes that go beyond verbal language and, with it, rational and conscious intentions. The interpersonal relationship thus falls within the broader framework of communication processes: it is these, or rather the related communicative vehicles, that tell us about the quality of relationships, and not the other way round, i.e. not that communication processes are regulated according to the relationship one wishes to have (Imbasciati, Margiotta, 2005). Many studies on the subject have shown how, beyond the competences that technically characterise the nursing profession, other competences of a purely relational nature play a fundamental role in the hospitalisation process and in maximising the patient's adherence to treatment, non-compliance with which is often the cause of therapeutic failures and a source of increased health and social costs. This aspect is, however, often called into question in favour of a greater emphasis on technical-professional aspects. From a “competence model” understood in a technicist sense there derives, in fact, a nursing care protocol based on the systematic application of the problem solving method: a precise protocol (diagnosis and planning) guides the professional interaction between the nurse and the person cared for. Alongside this procedure, however, the nursing care process also recognises a relational side, often wrongly called humanistic with reference to the subjectivity of the interacting protagonists: the professional and the beneficiary of care, understood in their bio-physiological, psychological and socio-cultural wholeness. In nursing thought, however, the meaning of the word relationship is generally translated as a continuous nurse-patient correspondence, based on the personal dimensions of the need for nursing care and characterised by a dialogical and personalised way of proceeding centred, however, on care aspects, in which the encounter between the interlocutors would determine the nature of the care to be provided and the means by which to put it into practice (Colliere, 1992; Motta, 2000). The nursing orientation thus affirms the presence of a relationship. But what kind of relationship is it? What capacities are needed to have a good relationship? And what is meant by “personal needs”? First of all, it is necessary to establish what a good relationship is. A good or bad relationship is the product of the way in which the practitioner enters into interaction with his or her patient, and is modulated essentially by the capacities that his or her psychic structure, conscious or not, brings into play.
RESEARCH DESIGN – STUDY 1. The objective of the first study of this research is an observation of the relational capacities detectable in the nurse/patient relationship, a relationship presumed to be one of caring. Attention was focused mainly on those dimensions that may constitute the nurse's relational capacities, also drawing on a comparison with the patient's expectations of the relationship and trying to explore what links exist between the two. A relationship, and above all a good relationship, cannot be established through goodwill, nor through so-called human sensitivity: it requires capacities that not everyone has and that, in order to be acquired, call for a type of training that affects the deep structures of the personality. It may be hypothesised that personality and its dimensions are the container and the basic elements on which mature relational capacities can be grown and developed. Personality dimensions are therefore the main hub from which the research can produce its results and towards which it turned in order to identify the measurement instruments. The motivation for our choice of instrument lies, therefore, in the attempt to explore the incidence of personality dimensions and sub-dimensions. Among these, the construct of alexithymia was considered important, as it is characteristic of the possession, and hence of the more or less adequate use, of relational capacities in the caring process.

Relevance:

30.00%

Publisher:

Abstract:

The thesis analyses and examines the relevant developments of EU law since the EU institutions were granted competence in matters of entry and residence of third-country nationals within the space of the European Union, as governed by Title IV of the Treaty establishing the European Community (now Title V of the Treaty on the Functioning of the European Union) and by the ensuing norms. On this basis, my research aims to reconstruct the current state of EU legislation in matters of entry and residence of third-country nationals, in order to establish the extent of the EU's competence in immigration and asylum, also in relation to the erosion of the Member States' competence in the same areas. The most significant sign of this evolution is the recognition of the right of third-country nationals who are long-term residents to move and reside within the territory of other Member States. The increased use of the EU's territory by third-country nationals has raised the problem of the evolution of the concept of EU citizenship, and in particular of its most significant content, namely the right to move freely. In this respect, EU citizenship could be freed from the requirement of nationality of a Member State, so as to be strictly related to the right of free use of the territory, as established by the internal market. This concept could then also include nationals of third countries.

Relevance:

30.00%

Publisher:

Abstract:

This thesis proposes a solution for board cutting in the wood industry, with the aim of minimising material usage and maximising machine productivity. The problem is dealt with as a Two-Dimensional Cutting Stock Problem, and specific Combinatorial Optimization methods are used to solve it, taking into account the features of the real problem.
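As a small illustration of the kind of combinatorial procedure such a problem calls for, the Java sketch below applies a simple level (shelf) first-fit-decreasing heuristic that packs rectangular pieces onto fixed-size boards. The heuristic and all dimensions are illustrative assumptions and are not the method actually developed in the thesis.

import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public final class ShelfPacking {

    record Piece(double width, double height) {}

    /** Number of boards of size boardW x boardH used by the shelf heuristic. */
    static int boardsNeeded(List<Piece> pieces, double boardW, double boardH) {
        List<Piece> sorted = new ArrayList<>(pieces);
        sorted.sort(Comparator.comparingDouble(Piece::height).reversed());

        int boards = 1;
        double shelfY = 0, shelfH = 0, cursorX = 0;
        for (Piece p : sorted) {
            if (cursorX + p.width() > boardW) {   // piece does not fit: open a new shelf
                shelfY += shelfH;
                shelfH = 0;
                cursorX = 0;
            }
            if (shelfY + p.height() > boardH) {   // shelf does not fit: open a new board
                boards++;
                shelfY = 0;
                shelfH = 0;
                cursorX = 0;
            }
            cursorX += p.width();
            shelfH = Math.max(shelfH, p.height());
        }
        return boards;
    }

    public static void main(String[] args) {
        List<Piece> order = List.of(new Piece(60, 40), new Piece(80, 30),
                new Piece(120, 50), new Piece(90, 45), new Piece(70, 35));
        System.out.println("Boards used: " + boardsNeeded(order, 200, 100));
    }
}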

Relevance:

30.00%

Publisher:

Abstract:

The aim of this work is to explain the paradigm of American foreign policy during the Johnson Administration, especially toward Europe, within the NATO framework, and toward the USSR, in the context of the détente that was just emerging during the 1960s. During that period, after the death of J. F. Kennedy, President L. B. Johnson inherited a complex and ambitious world policy, which aimed to open a new phase in transatlantic relations and to share the burden of the Cold War with a reluctant Europe. Known as the grand design, it was a policy that needed the support of the allies and a clear purpose which appealed to the Europeans. At first, President Johnson saw in the problem of nuclear sharing the bargain to strike with the NATO allies. At the same time, he understood that the United States needed to reassert its leadership within the new stage of relations with the Soviet Union. Soon, the “transatlantic bargain” became something not so easy to deal with. The Federal Republic of Germany wanted a say in nuclear affairs and, possibly, a finger on the trigger of the Atlantic nuclear weapons. The USSR, on the other hand, wanted to keep Germany down. The other allies did not want to share the burden of the defense of Europe, at most the responsibility for the use of the weapons and, at least, a part in the decision-making process. France, which wanted to detach herself from the policy of the United States and regain a world role, added difficulties to the management of this course of action. Throughout Johnson's term of office, the divergences among the policies proposed by his advisers to reach this goal put American foreign policy in deep water. The withdrawal of France from the organization, but not from the Alliance, gave Washington a chance to carry out its goal. The development of a clear-cut disarmament policy led the Johnson administration to the core of the matter. The Non-Proliferation Treaty, signed in 1968, solved the problem with the allies in a business-like fashion. The question of nuclear sharing faded away as the allies accepted a deeper consultative role in nuclear affairs, the burden of the defense of Europe became more bearable through the offset agreement with the FRG, and a new doctrine, flexible response, put an end, at least formally, to the taboo of the nuclear age. Johnson's grand design proved to be different from Kennedy's but, all things considered, it was more workable. The unexpected result was a real détente with the Soviet Union, which can be counted among President Johnson's merits.

Relevance:

30.00%

Publisher:

Abstract:

This work is dedicated to the study of damaging phenomena involving reinforced concrete structures and masonry buildings, and of their consequences in terms of decay of structural performance. In the Italian context there are many examples of structures that have already exceeded their service life, considering not only ancient buildings but also infrastructures and R/C buildings that have now been in operation for more than 50 years. Climate change, which affects the entire planet with changing seasonal weather and increasing environmental pollution, cannot be excluded as a harmful influence on the decay rate of building materials previously deemed durable. If the aggressive input changes very fast, for example within a few decades, it can also change the response of a construction material considered durable so far; in this way the knowledge of the art of building well, consolidated over the centuries, is thwarted. Hence this study focuses on the possibility of defining the residual capacity under vertical or seismic loads of structures that are already at the limit of their service life, or for which it is impossible to define a service life. The difficulty in an analysis of this kind, and what makes this research different from the main studies available in the literature, is to keep in correlation, in a computationally inexpensive way, issues such as: adequately simulated aggressive environmental inputs; environmental conditions favorable to the spread of pollutants and to the development of the degradation reactions (decay speed); and the link between environmental degradation and residual bearing capacity. A more realistic assessment of the residual performance of the materials that make up the structure makes it possible to go beyond the current approach to estimating residual load-bearing capacity, in which all factors are accounted for simply through a safety factor on the material properties.

Relevance:

30.00%

Publisher:

Abstract:

BTES (borehole thermal energy storage) systems exchange thermal energy by conduction with the surrounding ground through the borehole materials. The spatial variability of the geological properties and the space-time variability of the hydrogeological conditions affect the real power rate of the heat exchangers and, consequently, the amount of energy extracted from or injected into the ground. For this reason, it is not an easy task to identify the underground thermal properties to be used at the design stage. At the current state of technology, the Thermal Response Test (TRT) is the in-situ test that characterizes ground thermal properties with the highest degree of accuracy, but it does not fully solve the problem of characterizing the thermal properties of a shallow geothermal reservoir, simply because it characterizes only the neighborhood of the heat exchanger at hand and only for the duration of the test. Different analytical and numerical models exist for the characterization of shallow geothermal reservoirs, but they are still inadequate and not exhaustive: more sophisticated models must be taken into account, and a geostatistical approach is needed to tackle natural variability and estimation uncertainty. The approach adopted for reservoir characterization is the “inverse problem”, typical of oil & gas field analysis: we create different realizations of the thermal properties by direct sequential simulation and we find the one that best fits the real production data (fluid temperature over time). The software used to simulate heat production is FEFLOW 5.4 (Finite Element subsurface FLOW system). A geostatistical reservoir model has been set up based on thermal property data from the literature and on spatial variability hypotheses, and a real TRT has been tested. We then analyzed and used two other codes (SA-Geotherm and FV-Geotherm), which are two implementations of the same numerical model adopted by FEFLOW (the Al-Khoury model).
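For context, a standard first-pass interpretation of a TRT (not the geostatistical, FEFLOW-based inverse procedure described above) recovers the ground thermal conductivity from the infinite-line-source approximation, i.e. from the slope of the mean fluid temperature plotted against the logarithm of time. The Java sketch below shows that estimate with purely illustrative numbers.

public final class LineSourceTrt {

    /** Ground conductivity [W/mK] from heat rate Q [W], borehole length H [m],
     *  and the slope k of T_fluid versus ln(t): lambda = Q / (4 * pi * H * k). */
    static double conductivity(double heatRate, double boreholeLength, double slope) {
        return heatRate / (4.0 * Math.PI * boreholeLength * slope);
    }

    /** Least-squares slope of the fluid temperature against ln(time),
     *  computed on the late-time portion of the test record. */
    static double slopeVsLogTime(double[] timeSeconds, double[] fluidTempC) {
        int n = timeSeconds.length;
        double sx = 0, sy = 0, sxx = 0, sxy = 0;
        for (int i = 0; i < n; i++) {
            double x = Math.log(timeSeconds[i]);
            sx += x; sy += fluidTempC[i]; sxx += x * x; sxy += x * fluidTempC[i];
        }
        return (n * sxy - sx * sy) / (n * sxx - sx * sx);
    }

    public static void main(String[] args) {
        // Synthetic late-time record of the mean fluid temperature (10 to 50 hours).
        double[] t = {36000, 72000, 108000, 144000, 180000};
        double[] temp = {18.2, 19.1, 19.6, 20.0, 20.3};
        double slope = slopeVsLogTime(t, temp);
        System.out.printf("Estimated conductivity: %.2f W/mK%n",
                conductivity(5000.0, 100.0, slope)); // 5 kW test, 100 m borehole
    }
}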

Relevance:

30.00%

Publisher:

Abstract:

A fundamental gap in the current understanding of collapsed structures in the universe concerns the thermodynamical evolution of the ordinary, baryonic component. Unopposed radiative cooling of plasma would lead to the cooling catastrophe, a massive inflow of condensing gas toward the centre of galaxies, groups and clusters. The last generation of multiwavelength observations has radically changed our view on baryons, suggesting that the heating linked to the active galactic nucleus (AGN) may be the balancing counterpart of cooling. In this Thesis, I investigate the engine of the heating regulated by the central black hole. I argue that the mechanical feedback, based on massive subrelativistic outflows, is the key to solving the cooling flow problem, i.e. dramatically quenching the cooling rates for several billion years without destroying the cool-core structure. Using an upgraded version of the parallel 3D hydrodynamic code FLASH, I show that anisotropic AGN outflows can further reproduce fundamental observed features, such as buoyant bubbles, cocoon shocks, sonic ripples, metals dredge-up, and subsonic turbulence. The latter is an essential ingredient to drive nonlinear thermal instabilities, which cause cold gas condensation, a residual of the quenched cooling flow and, later, fuel for the AGN feedback engine. The self-regulated outflows are systematically tested on the scales of massive clusters, groups and isolated elliptical galaxies: in lighter less bound objects the feedback needs to be gentler and less efficient, in order to avoid drastic overheating. In this Thesis, I describe in depth the complex hydrodynamics, involving the coupling of the feedback energy to that of the surrounding hot medium. Finally, I present the merits and flaws of all the proposed models, with a critical eye toward observational concordance.