908 results for Optimized eco-productive paradigm
Abstract:
A prevalent claim is that we live in a knowledge economy. By knowledge economy we generally mean a "knowledge-based economy", one in which knowledge and technologies are used to produce economic benefits. Knowledge is thus both the tool and the raw material (people's skills) for producing some kind of product or service. In this environment, economic organization is undergoing several changes: authority relations are less important, legal and ownership-based definitions of the boundaries of the firm are becoming irrelevant, and there are only few constraints on the set of coordination mechanisms. What characterises a knowledge economy, then, is the growing importance of human capital in productive processes (Foss, 2005) and the increasing knowledge intensity of jobs (Hodgson, 1999). Economic processes are also highly intertwined with social processes: they tend to be informal and reciprocal rather than formal and negotiated. Another important point is the problem of the division of labor: as economic activity becomes mainly intellectual and requires the integration of specific, idiosyncratic skills, the task of dividing the job and assigning it to the most appropriate individuals becomes arduous, a "supervisory problem" (Hodgson, 1999) emerges, and traditional hierarchical control may prove increasingly ineffective. Not only does the specificity of know-how make it awkward to monitor the execution of tasks; more importantly, top-down integration of skills may be difficult because 'the nominal supervisors will not know the best way of doing the job – or even the precise purpose of the specialist job itself – and the worker will know better' (Hodgson, 1999). We therefore expect the organization of the economic activity of specialists to be, at least partially, self-organized. The aim of this thesis is to bridge studies from computer science, and in particular from Peer-to-Peer (P2P) networks, to organization theory.
We think the P2P paradigm fits well with the organizational problems of all those situations in which a central authority is not possible. We believe that P2P networks show a number of characteristics similar to firms working in a knowledge-based economy, and hence that the methodology used for studying P2P networks can be applied to organization studies. There are three main characteristics we think P2P networks have in common with firms involved in the knowledge economy:
- Decentralization: in a pure P2P system every peer is an equal participant; there is no central authority governing the actions of the single peers.
- Cost of ownership: P2P computing implies shared ownership, reducing the cost of owning the system and its content, and the cost of maintaining them.
- Self-organization: the process by which global order emerges within a system without another system dictating this order.
These characteristics are also present in the kind of firm we address, and that is why we have transferred the techniques we adopted in computer science studies (Marcozzi et al., 2005; Hales et al., 2007 [39]) to management science.
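The self-organization property listed above can be illustrated with a minimal gossip-averaging sketch, a generic P2P textbook example rather than anything from the thesis itself: each peer holds a local value and repeatedly averages it with a randomly met peer, and a global consensus emerges with no central coordinator.

```python
import random

def gossip_average(values, rounds=2000, seed=42):
    """Each round, two random peers meet and average their values (no central authority)."""
    rng = random.Random(seed)
    vals = list(values)
    n = len(vals)
    for _ in range(rounds):
        i, j = rng.sample(range(n), 2)    # a random pair of peers interacts
        avg = (vals[i] + vals[j]) / 2.0   # purely local computation
        vals[i] = vals[j] = avg
    return vals

peers = [float(v) for v in range(10)]     # initial local values 0..9
result = gossip_average(peers)
# All peers converge towards the global mean (4.5) without any coordinator.
```

The emergent global order (consensus on the mean) is never dictated by any single node, which is exactly the sense of "self-organization" used above.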
Abstract:
The research project presented in this dissertation is about text and memory. The title of the work is "Text and memory between Semiotics and Cognitive Science: an experimental setting about remembering a movie". The object of the research is the relationship between texts, or "textuality" to use a more general semiotic term, and memory. The goal is to analyze the link between those semiotic artifacts that a culture defines as autonomous meaningful objects, namely texts, and the cognitive performance of memory that allows them to be remembered. An active dialogue between Semiotics and Cognitive Science is the theoretical paradigm in which this research is set; the main intent is to establish a productive alignment between the "theory of text" developed in Semiotics and the "theory of memory" outlined in Cognitive Science. In particular, the research is an attempt to study how human subjects remember and/or misremember a film, as a specific case study; in semiotics, films are "cinematographic texts". The research is based on a corpus of data gathered through the qualitative method of interviewing. After an initial screening of a full-length feature film, each participant in the experiment was interviewed twice, according to a pre-established set of questions: the first interview immediately after the screening, the subsequent follow-up interview three months after it. The purpose of this design is to elicit two types of recall from the participants. In order to conduct a comparative inquiry, three films were used in the experimental setting. Each film was watched by thirteen subjects, each interviewed twice; the corpus is thus made up of seventy-eight interviews. The present dissertation presents the results of the investigation of these interviews. It is divided into six main parts. Chapter one presents a theoretical framework about the two main issues: memory and text.
The issue of memory is introduced through numerous studies carried out in the fields of Cognitive Science and Neuroscience, and a possible relationship with a semiotic approach is developed at the same time. The theoretical debate about textuality that characterizes the field of Semiotics is examined in the same chapter. Chapter two deals with methodology, showing how the whole method used to produce the corpus of data was defined. The interview is explored in detail: how it originated, what results are expected, and what the main underlying hypotheses are. Chapter three begins the investigation of the answers given by the spectators. It examines the phenomenon of outstanding details in the process of remembering, trying to define them in semiotic terms, and also investigates the most remembered scenes in the movie. Chapter four considers how the spectators deal with the narrative as a whole, and examines what they think about the global meaning of the film. Chapter five is about affects: it tries to define the role of emotions in the processes of comprehension and remembering. Chapter six presents a study of how the spectators account for a single scene of the movie. The complete work offers a broad perspective on the semiotic issue of textuality, drawing on both semiotic and cognitive competences, and at the same time presents a new outlook on the issue of memory, opening several directions of research.
Abstract:
Background. Abdominal porto-systemic collaterals (APSC) on color-Doppler ultrasound are a frequent finding in portal hypertensive cirrhotic patients. In patients with cirrhosis, an HVPG ≥ 16 mmHg has been shown to be associated with increased mortality in two studies. Non-invasive indicators of HVPG ≥ 16 mmHg might define a subgroup of high-risk patients, but data on this aspect are lacking. Aims. We aimed to investigate whether HVPG predicts mortality in patients with clinically significant portal hypertension, and whether APSC may predict a severe portal hypertensive state (i.e. HVPG ≥ 16 mmHg) in patients with cirrhosis and untreated portal hypertension. Methods. We analysed paired HVPG and ultrasonographic data of 86 untreated portal hypertensive cirrhotic patients. Data on the presence, type and number of APSC on abdominal echo-color-Doppler were prospectively collected. HVPG was measured following published guidelines. Clinical, laboratory and endoscopic data were available in all cases. First decompensation of cirrhosis and liver-disease-related mortality on follow-up (mean 28±20 months) were recorded. Results. 73% of patients had compensated cirrhosis, while 27% were decompensated. All patients had an HVPG ≥ 10 mmHg (mean 17.8±5.1 mmHg). 58% of compensated patients and 82% of decompensated patients had an HVPG over 16 mmHg. 25% had no varices, 28% had small varices, and 47% had medium/large varices. HVPG was higher in patients with esophageal varices than in patients without varices (19.0±4.8 vs. 14.1±4.2 mmHg, p<0.0001), and correlated with Child-Pugh score (R=0.494, p=0.019). APSC were detected in 36 (42%) patients, and were more frequent in decompensated patients (60% vs. 35%, p=0.03) and in patients with esophageal varices (52% vs. 9%, p=0.001). HVPG was higher in patients with APSC than in those without APSC (19.9±4.6 vs. 16.2±4.9 mmHg, p=0.001). The prevalence of APSC was higher in patients with HVPG ≥ 16 mmHg than in those with HVPG < 16 mmHg (57% vs. 13%, p<0.0001).
Decompensation was significantly more frequent in patients with HVPG ≥ 16 mmHg than in those with HVPG < 16 mmHg (35.1% vs. 11.5%, p=0.02). On multivariate analysis only HVPG and bilirubin were independent predictors of first decompensation. Ten patients died during follow-up, all with an HVPG ≥ 16 mmHg (26% vs. 0% in patients with HVPG < 16 mmHg, p=0.04). On multivariate analysis only MELD score and HVPG ≥ 16 mmHg were independent predictors of mortality. In compensated patients the detection of APSC predicted an HVPG ≥ 16 mmHg with 92% specificity, 54% sensitivity, and positive and negative likelihood ratios of 7.03 and 0.50, which implies that the demonstration of APSC on ultrasound increased the probability of HVPG ≥ 16 mmHg from 58% to 91%. Conclusions. HVPG retains an independent prognostic value in the subset of patients with cirrhosis and clinically significant portal hypertension. The presence of APSC is a specific indicator of severe portal hypertension in patients with cirrhosis. Detection of APSC on ultrasound allows the non-invasive identification of a subgroup of compensated patients with poor prognosis, avoiding the invasive measurement of HVPG.
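The jump from 58% to 91% quoted above follows from Bayes' rule in odds form applied to the reported positive likelihood ratio; a quick numerical check, using only the figures given in the abstract:

```python
def post_test_probability(pre_test_prob, likelihood_ratio):
    """Convert a pre-test probability to a post-test probability via the odds form of Bayes' rule."""
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# Figures from the abstract: 58% pre-test prevalence of HVPG >= 16 mmHg among
# compensated patients, and a positive likelihood ratio of 7.03 for APSC.
p = post_test_probability(0.58, 7.03)
# p is approximately 0.91, matching the 91% post-test probability in the text.
```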
Abstract:
In recent years an ever increasing degree of automation has been observed in most industrial processes. This increase is motivated by the demand for systems with high performance in terms of quality of the products/services generated, productivity, efficiency, and low costs in design, realization and maintenance. This trend in the growth of complex automation systems is rapidly spreading over automated manufacturing systems (AMS), where the integration of mechanical and electronic technology, typical of Mechatronics, is merging with other technologies such as Informatics and communication networks. An AMS is a very complex system that can be thought of as constituted by a set of flexible working stations and one or more transportation systems. To understand how important these machines are in our society, consider that every day most of us use bottles of water or soda, or buy products packed in boxes, like food or cigarettes. Another indication of their complexity is that the consortium of machine producers has estimated around 350 types of manufacturing machine. A large number of manufacturing machine industries are present in Italy, notably the packaging machine industry; a great concentration of this kind of industry is located in the Bologna area, which for this reason is called the "packaging valley". Usually, the various parts of an AMS interact in a concurrent and asynchronous way, and coordinating the parts of the machine to obtain a desired overall behaviour is a hard task. Often, this is the case in large-scale systems, organized in a modular and distributed manner.
Even if the success of a modern AMS from a functional and behavioural point of view is still attributable to the design choices made in defining the mechanical structure and the electrical/electronic architecture, the system that governs the control of the plant is becoming crucial, because of the large number of duties associated with it. Apart from the activity inherent to the automation of the machine cycles, the supervisory system is called on to perform other main functions: emulating the behaviour of traditional mechanical members, thus allowing a drastic constructive simplification of the machine and a crucial functional flexibility; dynamically adapting the control strategies to the different productive needs and operational scenarios; obtaining a high quality of the final product through verification of the correctness of the processing; guiding the operator assigned to the machine to promptly and carefully take the actions needed to establish or restore the optimal operating conditions; and managing in real time information on diagnostics, in support of the maintenance operations of the machine. The kind of facilities that designers can directly find on the market, in terms of software component libraries, provides adequate support for the implementation of both top-level and bottom-level functionalities, typically pertaining to the domains of user-friendly HMIs, closed-loop regulation and motion control, and fieldbus-based interconnection of remote smart devices. What is still lacking is a reference framework comprising a comprehensive set of highly reusable logic control components that, focusing on the cross-cutting functionalities characterizing the automation domain, may help designers in the process of modelling and structuring their applications according to their specific needs.
Historically, the design and verification process for complex automated industrial systems has been performed in an empirical way, without a clear distinction between functional and technological-implementation concepts and without a systematic method to deal organically with the complete system. In the field of analog and digital control, design and verification through formal and simulation tools have been adopted for a long time, at least for multivariable and/or nonlinear controllers for complex time-driven dynamics, as in the fields of vehicles, aircraft, robots, electric drives and complex power electronics equipment. Moving to the field of logic control, typical of industrial manufacturing automation, the design and verification process is approached in a completely different way, usually very "unstructured". No clear distinction is made between functions and implementations, or between functional architectures and technological architectures and platforms. This difference is probably due to the different "dynamical framework" of logic control with respect to analog/digital control. As a matter of fact, in logic control discrete-event dynamics replace time-driven dynamics; hence most of the formal and mathematical tools of analog/digital control cannot be directly migrated to logic control to highlight the distinction between functions and implementations. In addition, in the common view of application technicians, logic control design is strictly connected to the adopted implementation technology (relays in the past, software nowadays), leading again to a deep confusion between the functional view and the technological view. In industrial automation software engineering, concepts such as modularity, encapsulation, composability and reusability are strongly emphasized and profitably realized in so-called object-oriented methodologies.
Industrial automation has lately been receiving this approach, as testified by the IEC standards IEC 61131-3 and IEC 61499, which have been considered in commercial products only recently. On the other hand, in the scientific and technical literature many contributions have already been proposed to establish a suitable modelling framework for industrial automation. Over the last years it has been possible to note a considerable growth in the exploitation of innovative concepts and technologies from the ICT world in industrial automation systems. As far as logic control design is concerned, Model Based Design (MBD) is being imported into industrial automation from the software engineering field. Another key point in industrial automated systems is the growth of requirements in terms of availability, reliability and safety for technological systems. In other words, the control system should not only deal with the nominal behaviour, but should also deal with other important duties, such as diagnosis and fault isolation, recovery and safety management. Indeed, together with high performance, fault occurrences increase in complex systems. This is a consequence of the fact that, as typically occurs in reliable mechatronic systems, in complex systems such as AMS an increasing number of electronic devices are present alongside reliable mechanical elements, and these devices are more vulnerable by their own nature. The diagnosis and fault isolation problem in a generic dynamical system consists in the design of an elaboration unit that, by appropriately processing the inputs and outputs of the dynamical system, is capable of detecting incipient faults on the plant devices and reconfiguring the control system so as to guarantee satisfactory performance.
The designer should be able to formally verify the product, certifying that, in its final implementation, it will perform its required function while guaranteeing the desired level of reliability and safety; the next step is to prevent faults and eventually reconfigure the control system so that faults are tolerated. On this topic, an important contribution to formal verification of logic control, fault diagnosis and fault tolerant control comes from Discrete Event Systems theory. The aim of this work is to define a design pattern and a control architecture to help the designer of control logic in industrial automated systems. The work starts with a brief discussion of the main characteristics and a description of industrial automated systems in Chapter 1. In Chapter 2 a survey of the state of the software engineering paradigm applied to industrial automation is presented. Chapter 3 presents an architecture for industrial automated systems based on the new concept of Generalized Actuator, showing its benefits, while in Chapter 4 this architecture is refined using a novel entity, the Generalized Device, in order to achieve better reusability and modularity of the control logic. In Chapter 5 a new approach based on Discrete Event Systems is presented for the problem of software formal verification, together with an active fault tolerant control architecture using online diagnostics. Finally, concluding remarks and some ideas on new directions to explore are given. Appendix A briefly reports some concepts and results about Discrete Event Systems which should help the reader understand some crucial points in Chapter 5; Appendix B gives an overview of the experimental testbed of the Laboratory of Automation of the University of Bologna, used to validate the approaches presented in Chapters 3, 4 and 5. Appendix C reports some component models used in Chapter 5 for formal verification.
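Formal verification in the Discrete Event Systems setting often reduces to reachability analysis over a finite transition system. The thesis' actual models are not reproduced here; the sketch below, with a hypothetical three-state machine-cycle automaton, shows the basic idea of checking that no blocking (deadlock) state is reachable:

```python
from collections import deque

# Hypothetical DES model of a machine cycle: states with event-labelled transitions.
# A deadlock state is one with no outgoing transitions.
transitions = {
    "idle":    {"start": "working"},
    "working": {"done": "idle", "fault": "faulty"},
    "faulty":  {"repair": "idle"},   # recovery path: a fault can always be repaired
}

def reachable_states(initial, trans):
    """Breadth-first reachability over the transition system."""
    seen, queue = {initial}, deque([initial])
    while queue:
        s = queue.popleft()
        for nxt in trans.get(s, {}).values():
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def deadlock_free(initial, trans):
    """Verify that every reachable state has at least one outgoing event."""
    return all(trans.get(s) for s in reachable_states(initial, trans))

ok = deadlock_free("idle", transitions)   # True: this model is non-blocking
```

Removing the "repair" transition would make "faulty" a reachable deadlock, which is exactly the kind of property a verification step is meant to catch before deployment.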
Abstract:
The PhD project focused on the study of poultry welfare conditions and their improvement. The project work was divided into several main research activities. A) Field evaluation of the rearing conditions of meat chickens kept in intensive farms. Given the lack of published reports on overall Italian rearing conditions for broiler chickens, a survey was carried out to assess the welfare conditions of broilers reared by the most important poultry companies in Italy, and to verify whether they comply with the advice given in the European proposal COM (2005) 221 final. Chicken farm conditions, carcass lesions and meat quality were investigated. 1. The densities currently used in Italy are in accordance with the European proposal COM 221 final (2005), which suggests keeping broilers at a density lower than 30-32 kg live weight/m2 and not exceeding 38-40 kg live weight/m2. 2. The mortality rates in summer and winter agree with the mortality score calculated following the formula reported in the EU proposal COM 221 final (2005). 3. The incidence of damaged carcasses was very low and did not seem related to stocking density. 4. The foot pad dermatitis (FPD) scores were generally above the maximum limit advised by the EU proposal COM 221 final (2005), although the stocking densities were lower than 30-32 kg live weight per m2. 5. It can be stated that the control of environmental conditions, particularly litter quality, appears to be a key issue in controlling the onset of foot pad dermatitis. B) Manipulation of several farm parameters, such as litter material and depth, stocking density and light regimen, to improve chicken welfare conditions in the winter season. 1. Even though two different stocking densities were established in this study, the performances achieved by the chickens were almost identical among groups. 2.
The FCR was significantly better in the Standard conditions than for birds reared in the Welfare conditions, with lower stocking density, more litter material and a light program of 16 hours of light and 8 hours of dark. 3. In our trial, in the Standard groups we observed a higher content of moisture, nitrogen and ammonia released from the litter. It can therefore be assumed that the environmental characteristics were positively changed by the improvements in the rearing conditions adopted for the Welfare groups. 4. In the Welfare groups the exhausted litter of the pens was drier and broilers showed a lower occurrence of FPD. 5. The prevalence of hock burn lesions, like FPD, is high under poor litter quality conditions. 6. The combined effect of a lower stocking density, a greater amount of litter material and a photoperiod similar to the natural one positively influenced the chickens' welfare status; as a matter of fact, the occurrence of FPD in the Welfare groups was the lowest, keeping the score under the European threshold of proposal COM 221 final (2005). C) The purpose of the third research activity was to study the effect of high or low stocking density of broiler chickens, different types of litter, and the adoption of a short or long lighting regimen on broiler welfare, through the evaluation of productivity and the incidence of foot pad dermatitis during the hot season. 1. Feed efficiency was better for the Low Density than for the High Density broilers. 2. The appearance of FPD was not influenced by stocking density. 3. Foot examination revealed that lesions occurred more in birds maintained on chopped wheat straw than on wood shavings. 4. In conclusion, the adoption of a short light regimen, similar to that occurring in nature during summer, reduces feed intake without modifying the growth rate, thus improving feed efficiency.
Foot pad lesions were affected neither by stocking densities nor by light regimens, whereas wood shavings exerted a favourable effect in preserving the foot pads in good condition. D) A study was carried out to investigate more widely the possible role of 25-hydroxycholecalciferol supplemented in the diet of a commercial laying hen strain (Lohmann Brown), in comparison with diets supplemented with vitamin D3 or with D3 + 25-hydroxycholecalciferol. Egg traits during a productive cycle, as well as the bone characteristics of the layers, were also evaluated to determine whether vitamin D3 may enhance the welfare status of the birds. 1. The weight of the egg and of its components is often greater in hens fed a diet enriched with 25-hydroxycholecalciferol. 2. Since the eggs of treated groups are heavier and a larger amount of shell is needed, a direct effect on shell strength is observed. 3. At 30 and 50 wk of age, hens fed 25-hydroxycholecalciferol exhibited greater bone breaking force. 4. Radiographic density values obtained in the trial were always higher in hens fed 25-hydroxycholecalciferol in both treatments: supplemented for the whole laying cycle (25D3) or from 40 weeks of age onward (D3+25D3).
Abstract:
Different tools were used to set up and adopt the model for fulfilling the objective of this research. 1. The model. The base model used is the Analytic Hierarchy Process (AHP), adapted with the aim of performing a benefit-cost analysis. The AHP, developed by Thomas Saaty, is a multicriteria decision-making technique which decomposes a complex problem into a hierarchy. It is used to derive ratio scales from both discrete and continuous paired comparisons in multilevel hierarchic structures. These comparisons may be taken from actual measurements or from a fundamental scale that reflects the relative strength of preferences and feelings. 2. Tools and methods. 2.1. The Expert Choice software. The software Expert Choice is a tool that allows each operator to easily implement the AHP model at every stage of the problem. 2.2. Personal interviews at the farms. For this research, the EMAS-certified farms of the Emilia-Romagna region were identified; the information was provided by the EMAS centre in Vienna. Personal interviews were carried out at each farm in order to obtain a complete and realistic judgment on each criterion of the hierarchy. 2.3. Questionnaire. A supporting questionnaire was also delivered and used for the interviews. 3. Elaboration of the data. After data collection, the data were elaborated using the Expert Choice software. 4. Results of the analysis. The result of the figures above (see separate document) is a series of numbers which are fractions of the unit. These are to be interpreted as the relative contribution of each element to the fulfillment of the corresponding objective.
Calculating the benefit/cost ratio for each alternative, the following is obtained. Alternative one: implement EMAS. Benefits ratio: 0.877; costs ratio: 0.815; benefit/cost ratio: 0.877/0.815 = 1.08. Alternative two: do not implement EMAS. Benefits ratio: 0.123; costs ratio: 0.185; benefit/cost ratio: 0.123/0.185 = 0.66. As stated above, the alternative with the highest ratio is the best solution for the organization. This means that the research carried out and the model implemented suggest that EMAS adoption is the best alternative for the agricultural sector. It has to be noted, however, that the ratio is 1.08, a relatively low positive value. This shows the fragility of the conclusion and suggests a careful examination of the benefits and costs for each farm before adopting the scheme. On the other hand, the result should be taken into consideration by policy makers in order to improve their interventions regarding the adoption of the scheme in the agricultural sector. According to the AHP elaboration of judgments, the main considerations on benefits are the following:
- Legal compliance seems to be the most important benefit for the agricultural sector, since its rank is 0.471.
- The next two most important benefits are improved internal organization (ranking 0.230), followed by competitive advantage (ranking 0.221), mostly due to the sub-element improved image (ranking 0.743).
Finally, even though incentives are not ranked among the most important elements, the financial ones seem to have been decisive in the decision-making process. According to the AHP elaboration of judgments, the main considerations on costs are the following:
- External costs seem to be largely more important than internal ones (ranking 0.857 versus 0.143), suggesting that EMAS costs for consultancy and verification remain the biggest obstacle.
- The implementation of the EMS is the most challenging element among the internal costs (ranking 0.750).
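AHP priorities like the rankings above are derived from pairwise comparison matrices; Expert Choice uses the principal eigenvector, but the geometric-mean approximation sketched below gives very similar weights. The 3x3 matrix is hypothetical, for illustration only, and is not the set of judgments actually collected in the interviews; the benefit/cost ratios at the end are the ones reported in the abstract.

```python
from math import prod

# Hypothetical pairwise comparison matrix on Saaty's 1-9 scale (illustration only).
comparisons = [
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
]

def ahp_weights(matrix):
    """Approximate AHP priorities by normalized row geometric means."""
    n = len(matrix)
    gms = [prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

weights = ahp_weights(comparisons)  # sums to 1; the first criterion dominates

# Benefit/cost ratios reported in the abstract:
bc_implement = 0.877 / 0.815  # about 1.08
bc_do_not = 0.123 / 0.185     # about 0.66
```

As in the analysis above, the alternative with the higher benefit/cost ratio (implement EMAS) is preferred, though a ratio barely above 1 warrants the caution expressed in the text.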
Abstract:
The research performed during the PhD course aimed to assess innovative applications of near-infrared spectroscopy in reflectance (NIR) in the beer production chain. The purpose is to measure by NIR the "malting quality" (MQ) parameter of barley, to monitor the malting process, and to establish whether a certain type of barley is suitable for the production of beer and spirits; NIR was also applied to monitor the brewing process. First of all, it was possible to check the quality of raw materials such as barley, maize and barley malt using a rapid, non-destructive and reliable method with a low prediction error. The most interesting result obtained at this level was that the repeatability of the NIR calibration models developed was comparable with that of the reference method. Moreover, for malt, new kinds of validation were used in order to estimate the real predictive power of the proposed calibration models and to understand long-term effects. Furthermore, the precision of all the calibration models developed for malt evaluation was estimated and statistically compared with the reference methods, with good results. Then, new calibration models were developed for monitoring the malting process, measuring the moisture content and other malt quality parameters during germination. It was also possible to obtain by NIR an estimate of the "malting quality" (MQ) of barley, and to predict whether its germination will be rapid and uniform and whether a certain type of barley is suitable for the production of beer and spirits. Finally, the NIR technique was applied to monitor the brewing process, using correlations between NIR spectra of beer and analytical parameters, and to assess beer quality.
These innovative results are potentially very useful for the actors involved in the beer production chain, especially the calibration models suitable for the control of the malting process and for the assessment of the "malting quality" of barley, which deserve to be explored further in future studies.
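NIR calibration models of the kind described are multivariate regressions (typically PLS) from spectra to a reference analytical value, judged by their prediction error. The thesis' actual spectra and chemometric pipeline are not reproduced here; as a minimal stdlib-only illustration with synthetic data, the sketch below calibrates a single hypothetical absorbance band against a reference value and reports the root mean square error of prediction (RMSEP) on a hold-out validation set:

```python
import math
import random

rng = random.Random(1)

# Synthetic stand-in for NIR calibration data: the absorbance at one informative
# band (hypothetical) relates linearly to the reference value, e.g. malt moisture %.
absorbance = [rng.uniform(0.2, 0.8) for _ in range(60)]
moisture = [4.0 + 8.0 * a + rng.gauss(0, 0.05) for a in absorbance]

# Calibration set (first 40 samples) and validation set (last 20).
x_cal, y_cal = absorbance[:40], moisture[:40]
x_val, y_val = absorbance[40:], moisture[40:]

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept (PLS would be used in practice)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

slope, intercept = fit_line(x_cal, y_cal)
pred = [slope * x + intercept for x in x_val]
rmsep = math.sqrt(sum((p - y) ** 2 for p, y in zip(pred, y_val)) / len(y_val))
# A low RMSEP (here close to the simulated 0.05 noise level) indicates a good calibration.
```

Comparing RMSEP against the repeatability of the reference method, as done in the abstract, is what establishes whether the NIR model can replace the wet-chemistry assay.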
Abstract:
Nowadays the development of sustainable polymers with properties suitable to substitute traditional petroleum-based materials is one of the major issues in materials science, and the use of renewable resources as feedstock for biopolyesters is a challenging target. The research work described in the present thesis is strictly connected to these urgent necessities and is mainly focused on finding new biopolymers, in particular biopolyesters, which are obtainable from biomass and characterized by a wide range of properties, in order to potentially substitute polyolefins and aromatic polyesters (for example, poly(ethylene terephthalate)).
Abstract:
This research deals with the development and use of an environmental accounting matrix for Emilia-Romagna, RAMEA air emissions (a regional NAMEA), carried out by the Regional Environment Agency (Arpa) within a European project. After a depiction of the international context, regarding the widespread need to integrate economic indicators and go beyond conventional reporting systems, this study explains the structure, update and development of the tool. The overall aim is to use the matrix for environmental assessments of regional plans, to draw up sustainability reports and to monitor the effects of regional policies in a sustainable development perspective. The work focused on an application of a shift-share model; on the integration with eco-taxes, industrial waste production and energy consumption; and on applications of the extended RAMEA as a policy tool, following Eurostat guidelines. The common thread is the eco-efficiency (economic-environmental efficiency) index. The first part, in English, treats the methodology used to build a more complete tool; in the second part, in Italian, RAMEA is applied to two regional case studies to support decision makers in Strategic Environmental Assessment processes (Directive 2001/42/EC). The aim is to support evidence-based policy making by integrating sustainable development concerns at all levels. The first case study regards integrated environmental-economic analyses in support of the SEA of the regional waste management plan. For industrial waste production, an extended and updated RAMEA was developed as a useful policy tool to help analyse and monitor the state of environmental-economic performance. The second case study deals with the environmental report for the SEA of the regional programme for productive activities.
RAMEA was applied with the aim of an integrated environmental-economic analysis of the context, to investigate the performance of the regional production chains and to depict and monitor the area where the programme is to be carried out, from an integrated environmental-economic perspective.
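The shift-share model mentioned above decomposes a regional sector's growth into a national-growth component, an industry-mix component and a regional-shift component, which sum exactly to the observed change. A minimal sketch with hypothetical figures (not the actual RAMEA data):

```python
def shift_share(regional_base, regional_growth, national_sector_growth, national_growth):
    """Classic shift-share decomposition for one sector.

    regional_base:          sector size in the region at the start period
    regional_growth:        sector growth rate in the region
    national_sector_growth: growth rate of the same sector nationally
    national_growth:        overall national growth rate
    """
    national_component = regional_base * national_growth
    industry_mix = regional_base * (national_sector_growth - national_growth)
    regional_shift = regional_base * (regional_growth - national_sector_growth)
    return national_component, industry_mix, regional_shift

# Hypothetical figures: a sector of size 100 growing 8% regionally,
# 5% nationally for that sector, 3% for the whole national economy.
ns, mix, shift = shift_share(100.0, 0.08, 0.05, 0.03)
total_change = 100.0 * 0.08
# ns + mix + shift equals the observed regional change.
```

The regional-shift term is the part of growth attributable to local competitiveness, which is what makes the decomposition useful for monitoring regional eco-efficiency performance.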
Abstract:
The consumer society has made every vital sphere susceptible to commodification. Capitalism has long since shed the heavier elements of productive practice to take on the symbolic and cultural dimension. Consumption, having become sign, the immaterial dimension of our material lives, has ended up colonizing the sphere of experience and, therefore, of life itself. Consumption becomes, above all, an experience of life: a continually changing experience that allows us to live numerous lives, each one different. What is stable is the consumerist paradigm, which invests our very identity, the identity of places, the identity of the world. Nomadism is the most typical dimension of consumption, just as the perennial mobility of life is the dimension proper to the global age. Nevertheless, the new forms of political, ethical and responsible consumerism that have followed the rise of global risks, investing precisely the spheres of experience and consumption, contribute to modifying behaviour in a "reflexive" and "glocal" direction. The field of tourism, representing at once the paradigm of experience and that of global mobility, can then become a privileged observatory for investigating these same reflexive forms of consumption, which give an entirely new meaning to experience as much as to consumption itself. Through a case study, this thesis therefore aims to examine in depth how new emerging forms of responsible tourism can represent a key of access to new forms of sustainable development of local territories in contexts of early modernization.
Abstract:
This work dealt with the design of a UWB antenna for the realization of an RFID tag, and is part of the GRETA (GREen TAgs) project, funded by MIUR. The main characteristics required of the green tag are: overall dimensions of at most 4-5 cm, absence of batteries, and environmental compatibility. Eco-compatibility is guaranteed by realizing the antenna on a paper substrate; the limits deriving from the absence of batteries are instead overcome through energy harvesting, in order to achieve complete energy autonomy. The UWB technique is exploited for communication in the 3.1-4.8 GHz band, while energy harvesting is performed at 868 MHz. Finally, some first results were obtained on the power that can be rectified with the proposed solution, through the realization of a suitable rectifier circuit.
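The power available for harvesting at 868 MHz can be estimated with the Friis free-space equation before sizing the rectifier. The sketch below uses hypothetical transmit power, antenna gains and distance; none of these figures come from the thesis:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def friis_received_dbm(pt_dbm, gt_dbi, gr_dbi, freq_hz, distance_m):
    """Received power in dBm from the Friis free-space equation."""
    wavelength = C / freq_hz
    fspl_db = 20.0 * math.log10(4.0 * math.pi * distance_m / wavelength)
    return pt_dbm + gt_dbi + gr_dbi - fspl_db

# Hypothetical link: 14 dBm transmitter, 2 dBi antennas, 5 m range at 868 MHz.
pr = friis_received_dbm(14.0, 2.0, 2.0, 868e6, 5.0)
# Doubling the distance costs about 6 dB, as expected for free-space propagation.
delta = friis_received_dbm(14.0, 2.0, 2.0, 868e6, 10.0) - pr
```

Estimates of this kind set an upper bound on the RF power reaching the rectifier; the actually rectified power is further reduced by the rectifier's conversion efficiency, which is what the circuit measurements in the thesis address.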