23 results for Ophthalmic Optics and Devices
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
Reliable electronic systems, namely sets of reliable electronic devices connected to each other and working together correctly for the same functionality, represent an essential ingredient for the large-scale commercial implementation of any technological advancement. Microelectronics technologies and new powerful integrated circuits provide noticeable improvements in performance and cost-effectiveness, and allow electronic systems to be introduced in increasingly diversified contexts. On the other hand, the opening of new fields of application leads to new, unexplored reliability issues. The development of semiconductor device and electrical models (such as the well-known SPICE models) able to describe the electrical behavior of devices and circuits is a useful means to simulate and analyze the functionality of new electronic architectures and new technologies. Moreover, it represents an effective way to point out the reliability issues due to the employment of advanced electronic systems in new application contexts. In this thesis, the modeling and design of both advanced reliable circuits for general-purpose applications and devices for energy efficiency are considered. In detail, the following activities have been carried out: first, reliability issues in terms of security of standard communication protocols in wireless sensor networks are discussed, and a new communication protocol that increases network security is introduced. Second, a novel scheme for the on-die measurement of either clock jitter or process parameter variations is proposed; the developed scheme can be used for an evaluation of both jitter and process parameter variations at low cost. Then, reliability issues in the field of “energy scavenging systems” are analyzed: an accurate analysis and modeling of the effects of faults affecting circuits for energy harvesting from mechanical vibrations is performed. Finally, the problem of modeling the electrical and thermal behavior of photovoltaic (PV) cells under hot-spot conditions is addressed with the development of an electrical and thermal model.
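As a minimal illustration of the kind of electrical device model the abstract refers to (not a model from the thesis itself), the diode model at the heart of SPICE's .MODEL D card reduces to the Shockley equation; the Python sketch below evaluates it with illustrative parameter values:

```python
import numpy as np

# Shockley diode equation, the core of a SPICE diode model:
#   I = Is * (exp(V / (n * Vt)) - 1)
# Parameter values below are illustrative, not from the thesis.
def diode_current(v, i_s=1e-14, n=1.0, t_kelvin=300.0):
    k_b = 1.380649e-23   # Boltzmann constant [J/K]
    q = 1.602176634e-19  # elementary charge [C]
    v_t = k_b * t_kelvin / q  # thermal voltage, ~25.85 mV at 300 K
    return i_s * (np.exp(v / (n * v_t)) - 1.0)

# I-V sweep from -0.2 V to 0.8 V, as a circuit simulator would perform
for v in np.linspace(-0.2, 0.8, 11):
    print(f"V = {v:+.2f} V  ->  I = {diode_current(v):.3e} A")
```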
Abstract:
The discovery of the Cosmic Microwave Background (CMB) radiation in 1965 is one of the fundamental milestones supporting the Big Bang theory. The CMB is one of the most important sources of information in cosmology. The excellent accuracy of the recent CMB data from the WMAP and Planck satellites confirmed the validity of the standard cosmological model and set a new challenge for the data analysis processes and their interpretation. In this thesis we deal with several aspects and useful tools of the data analysis, focusing on their optimization in order to fully exploit the Planck data and contribute to the final published results. The issues investigated are: the change of coordinates of CMB maps using the HEALPix package, the problem of the aliasing effect in the generation of low-resolution maps, and the comparison of the Angular Power Spectrum (APS) extraction performance of the optimal QML method, implemented in the code called BolPol, with that of the pseudo-Cl method, implemented in Cromaster. The QML method was then applied to the Planck data at large angular scales to extract the CMB APS. The same method was also applied to analyze the TT parity and the Low Variance anomalies in the Planck maps, showing a consistent deviation from the standard cosmological model; the possible origins of these results are discussed. The Cromaster code was instead applied to the 408 MHz and 1.42 GHz surveys, focusing on the analysis of the APS of selected regions of the synchrotron emission. The new generation of CMB experiments will be dedicated to polarization measurements, which require high-accuracy devices for separating the polarizations. Here a technology called Photonic Crystals is exploited to develop a new polarization splitter device, and its performance is compared with that of the devices used nowadays.
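As an aside on the first issue listed (the coordinate change of CMB maps), a minimal sketch using healpy, the Python implementation of the HEALPix package, might look as follows; the resolution, input spectrum, and coordinate pair are illustrative and do not reproduce the thesis' actual pipeline:

```python
import numpy as np
import healpy as hp

# Toy full-sky map at low resolution (NSIDE=64 is illustrative only)
nside = 64
cl = np.ones(3 * nside)        # flat input power spectrum, toy example
m_gal = hp.synfast(cl, nside)  # random Gaussian map, taken as Galactic coords

# Rotate the map from Galactic ('G') to Ecliptic ('E') coordinates
rot = hp.Rotator(coord=['G', 'E'])
m_ecl = rot.rotate_map_pixel(m_gal)

print(m_ecl.size)  # 49152 pixels = 12 * nside**2
```

Pixel-space rotation interpolates pixel values, so at low resolution it interacts with the aliasing issue the abstract mentions; the Rotator class also exposes a harmonic-space alternative (rotate_map_alms).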
Abstract:
The role of non-neuronal brain cells, called astrocytes, is emerging as crucial in brain function and dysfunction, superseding the neurocentric view that envisioned glia as passive components. Ion and water channels and calcium signalling, expressed in functional micro- and nanodomains, underpin astrocytes' homeostatic function, synaptic transmission, and neurovascular coupling, acting both locally and globally. In this respect, a major issue arises concerning the mechanism through which astrocytes can control processes across scales. Moreover, astrocytes can sense and react to extracellular stimuli such as chemical, physical, mechanical, electrical, and photonic ones at the nanoscale. Given their emerging importance and their sensing properties, the general goal of my PhD research program was to validate nanomaterial, interface, and device approaches developed ad hoc to study astrocytes. The results achieved are reported in the form of a collection of papers. Specifically, we demonstrated that i) electrospun nanofibers made of polycaprolactone and polyaniline conductive composites can shape primary astrocytes' morphology without affecting their function; ii) gold-coated silicon nanowire devices enable extracellular recording of an unprecedented slow wave in primary differentiated astrocytes; iii) colloidal hydrotalcite films allow insight into the cell volume regulation process in differentiated astrocytes and the description of novel cytoskeletal actin dynamics; iv) gold nanoclusters represent nanoprobes to trigger astrocyte structure and function; v) nanopillars of photoexcitable organic polymer are a potential tool to achieve nanoscale photostimulation of astrocytes. The results were achieved by a multidisciplinary team working with national and international collaborators who are listed and acknowledged in the text. Collectively, the results showed that astrocytes represent a novel opportunity and target for Nanoscience, and that the nanoglial interface might help to unveil clues on brain function or represent a novel therapeutic approach to treat brain dysfunctions.
Abstract:
Bioelectronic interfaces have significantly advanced in recent years, offering potential treatments for vision impairments, spinal cord injuries, and neurodegenerative diseases. However, the classical neurocentric vision drives technological development toward neurons. Emerging evidence highlights the critical role of glial cells in the nervous system. Among them, astrocytes significantly influence neuronal networks throughout life and are implicated in several neuropathological states. Although they are incapable of firing action potentials, astrocytes communicate through diverse calcium (Ca2+) signalling pathways, crucial for cognitive functions and brain blood flow regulation. Current bioelectronic devices are primarily designed to interface neurons and are unsuitable for studying astrocytes. Graphene, with its unique electrical, mechanical, and biocompatibility properties, has emerged as a promising neural interface material. However, its use as an electrode interface to modulate astrocyte functionality remains unexplored. The aim of this PhD work was to exploit graphene oxide (GO)- and reduced GO (rGO)-coated electrodes to control Ca2+ signalling in astrocytes by electrical stimulation. We discovered that distinct Ca2+ dynamics can be evoked in astrocytes, in vitro and in brain slices, depending on the conductive/insulating properties of rGO/GO electrodes. Stimulation by rGO electrodes induces an intracellular Ca2+ response with sharp peaks of oscillations (“P-type”), exclusively due to Ca2+ release from intracellular stores. Conversely, astrocytes stimulated by GO electrodes show a slower and sustained Ca2+ response (“S-type”), largely mediated by external Ca2+ influx through specific ion channels. Astrocytes respond faster than neurons and activate distinct G-protein coupled receptor intracellular signalling pathways. We propose a resistive/insulating model, hypothesizing that the different conductivity of the substrate influences the electric field at the cell/electrolyte or cell/material interfaces, favouring, respectively, Ca2+ release from intracellular stores or extracellular Ca2+ influx. This research provides a simple tool to selectively control distinct Ca2+ signals in brain astrocytes for neuroscience and bioelectronic medicine.
Abstract:
Most current ultra-miniaturized devices are obtained by the top-down approach, in which nanoscale components are fabricated by cutting down larger precursors. Since this physical-engineering method is reaching its limits, especially for components below 30 nm in size, alternative strategies are necessary. Of particular appeal to chemists is the supramolecular bottom-up approach to nanotechnology, a methodology that utilizes the principles of molecular recognition to build materials and devices from molecular components. The subject of this thesis is the photophysical and electrochemical investigation of nanodevices obtained by harnessing the principles of supramolecular chemistry. These systems operate in solution-based environments and are investigated at the ensemble level. The majority of the chemical systems discussed here are based on pseudorotaxanes and catenanes. Such supramolecular systems represent prototypes of molecular machines, since they are capable of performing simple controlled mechanical movements. Their properties and operation are strictly related to the supramolecular interactions between molecular components (generally photoactive or electroactive molecules) and to the possibility of modulating such interactions by means of external stimuli. The main issues addressed throughout the thesis are: (i) the analysis of the factors that can affect the architecture and perturb the stability of supramolecular systems; (ii) the possibility of controlling the direction of supramolecular motions by exploiting the molecular information content; (iii) the development of switchable supramolecular polymers starting from simple host-guest complexes; (iv) the capability of some molecular machines to process information at the molecular level, thus behaving as logic devices; (v) the behaviour of molecular machine components in a biological-type environment; (vi) the study of chemically functionalized metal nanoparticles by second harmonic generation spectroscopy.
Abstract:
The present research thesis focused on the development of new biomaterials and devices for application in regenerative medicine, particularly in the repair/regeneration of bone and osteochondral regions affected by degenerative diseases such as osteoarthritis and osteoporosis or by serious traumas. More specifically, the work focused on the synthesis and physico-chemical-morphological characterization of: i) a new superparamagnetic apatite phase; ii) new biomimetic superparamagnetic bone and osteochondral scaffolds; iii) new bioactive bone cements for regenerative vertebroplasty. The new bio-devices were designed to exhibit high biomimicry with hard human tissues and functionality promoting faster tissue repair and improved texturing. In particular, recent trends in tissue regeneration indicate magnetism as a new tool to stimulate cells towards tissue formation and organization; in this perspective, a new superparamagnetic apatite was synthesized by doping the apatite lattice with di- and trivalent iron ions during synthesis. This finding was the starting point for synthesizing newly conceived superparamagnetic bone and osteochondral scaffolds by reproducing in the laboratory the biological processes yielding the formation of new bone, i.e. the self-assembly/organization of collagen fibrils and the heterogeneous nucleation of nanosized, ionically substituted apatite mimicking the mineral part of bone. The new scaffolds can be magnetically switched on/off and function as workstations guiding fast tissue regeneration through minimally invasive and more efficient approaches. Moreover, in view of specific treatments for patients affected by osteoporosis or by traumas involving vertebral weakening or fracture, the present work was also dedicated to the development of new self-setting injectable pastes based on strontium-substituted calcium phosphates, able to harden in vivo and transform into strontium-substituted hydroxyapatite. The addition of strontium may provide an anti-osteoporotic effect, helping to restore physiologic bone turnover. Bio-polymers able to be progressively resorbed were also added to the ceramic-based paste, creating additional porosity in the cement body that favours cell colonization and osseointegration.
Abstract:
During the last decade, peach and nectarine fruit have lost considerable market share due to increased consumer dissatisfaction with quality at retail markets. This is mainly due to the harvesting of too-immature fruit and to high ripening heterogeneity. The main problem is that the traditionally used maturity indices are not able to objectively detect the fruit maturity stage, nor the variability present in the field, leading to difficult post-harvest management of the product and to high fruit losses. To assess fruit ripening more precisely, other techniques and devices can be used. Recently, a new non-destructive maturity index based on vis-NIR technology, the Index of Absorbance Difference (IAD), which correlates with fruit degreening and ethylene production, was introduced, and the IAD was used to study peach and nectarine fruit ripening from the “field to the fork”. In order to choose the best techniques to improve fruit quality, a detailed description of the tree structure, of fruit distribution, and of ripening evolution on the tree was undertaken. More specifically, an architectural model (PlantToon®) was used to design the tree structure, and the IAD was applied to characterize the maturity stage of each fruit. Their combined use provided an objective and precise evaluation of the fruit ripening variability related to different training systems, crop load, fruit exposure, and internal temperature. Based on simple field assessment of fruit maturity (as IAD) and growth, a model for early prediction of harvest date and yield was developed and validated. The relationship between the non-destructive maturity index IAD and fruit shelf-life was also confirmed. Finally, the obtained results were validated by consumer tests: fruit sorted into different maturity classes obtained different consumer acceptance. The improved knowledge led to an innovative management of peach and nectarine fruit, from “field to market”.
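For context, the IAD is computed as the difference between the absorbance measured near the chlorophyll-a absorption peak (around 670 nm) and at a nearby reference wavelength (around 720 nm), so it decreases as the fruit ripens and loses chlorophyll. The sketch below shows the arithmetic with purely illustrative readings and hypothetical class thresholds (real boundaries are cultivar-specific):

```python
# Index of Absorbance Difference: IAD = A(670 nm) - A(720 nm).
# Absorbance values and maturity-class thresholds below are
# illustrative placeholders, not calibrated values from the thesis.
def iad(absorbance_670: float, absorbance_720: float) -> float:
    return absorbance_670 - absorbance_720

def maturity_class(iad_value: float) -> str:
    # Hypothetical class boundaries for illustration only
    if iad_value > 1.5:
        return "immature"
    if iad_value > 0.8:
        return "commercial harvest"
    return "ready to eat"

reading = iad(absorbance_670=1.9, absorbance_720=0.7)  # IAD = 1.2
print(reading, maturity_class(reading))  # 1.2 commercial harvest
```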
Abstract:
The application of modern ICT technologies is radically changing many fields, pushing toward more open and dynamic value chains that foster the cooperation and integration of many connected partners, sensors, and devices. As a valuable example, the emerging Smart Tourism field derives from the application of ICT to Tourism so as to create richer and more integrated experiences, making them more accessible and sustainable. From a technological viewpoint, a recurring challenge in these decentralized environments is the integration of heterogeneous services and data spanning multiple administrative domains, each possibly applying different security/privacy policies, device and process control mechanisms, service access and provisioning schemes, etc. The distribution and heterogeneity of those sources exacerbate the complexity of developing integration solutions, with consequently high effort and costs for the partners seeking them. Taking a step towards addressing these issues, we propose APERTO, a decentralized and distributed architecture that aims at facilitating the blending of data and services. At its core, APERTO relies on APERTO FaaS, a Serverless platform allowing fast prototyping of the business logic, lowering the barrier of entry and development costs for newcomers, fine-grained (down-to-zero) scaling of the resources servicing end-users, and reduced management overhead. The APERTO FaaS infrastructure is based on asynchronous and transparent communications between the components of the architecture, allowing the development of optimized solutions that exploit the peculiarities of distributed and heterogeneous environments. In particular, APERTO addresses the provisioning of scalable and cost-efficient mechanisms targeting: i) function composition, allowing the definition of complex workloads from simple, ready-to-use functions, enabling smarter management of complex tasks and improved multiplexing capabilities (see the sketch below); ii) the creation of end-to-end differentiated QoS slices minimizing interferences among applications/services running on a shared infrastructure; iii) an abstraction providing uniform and optimized access to heterogeneous data sources; iv) a decentralized approach for the verification of access rights to resources.
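The following Python toy sketch illustrates the function-composition idea in point i); every name here is hypothetical and does not reflect the actual APERTO FaaS API:

```python
from functools import reduce
from typing import Callable

# Hypothetical illustration of FaaS-style function composition:
# each "function" is a small, independently deployable unit, and a
# workflow is an ordered pipeline of such units.
def fetch_sensor_data(event: dict) -> dict:
    event["readings"] = [21.5, 22.1, 21.9]  # stubbed data source
    return event

def aggregate(event: dict) -> dict:
    r = event["readings"]
    event["mean"] = sum(r) / len(r)
    return event

def notify(event: dict) -> dict:
    print(f"mean temperature: {event['mean']:.2f} °C")
    return event

def compose(*fns: Callable[[dict], dict]) -> Callable[[dict], dict]:
    """Chain functions left-to-right into a single invocable workload."""
    return lambda event: reduce(lambda acc, f: f(acc), fns, event)

pipeline = compose(fetch_sensor_data, aggregate, notify)
pipeline({"source": "room-42"})
```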
Abstract:
The dynamicity and heterogeneity that characterize pervasive environments raise new challenges in the design of mobile middleware. Pervasive environments are characterized by a significant degree of heterogeneity, variability, and dynamicity that conventional middleware solutions are not able to adequately manage. Originally designed for use in a relatively static context, such middleware systems tend to hide low-level details to provide applications with a transparent view of the underlying execution platform. In mobile environments, however, the context is extremely dynamic and cannot be managed by a priori assumptions. Novel middleware should therefore support mobile computing applications in the task of adapting their behavior to frequent changes in the execution context, that is, it should become context-aware. In particular, this thesis has identified the following key requirements for novel context-aware middleware that existing solutions do not yet fulfil. (i) Middleware solutions should support interoperability between possibly unknown entities by providing expressive representation models that allow the description of interacting entities, their operating conditions, and the surrounding world, i.e., their context, according to an unambiguous semantics. (ii) Middleware solutions should support distributed applications in the task of reconfiguring and adapting their behavior/results to ongoing context changes. (iii) Context-aware middleware support should be deployable on heterogeneous devices under variable operating conditions, such as different user needs, application requirements, available connectivity and device computational capabilities, as well as changing environmental conditions. Our main claim is that the adoption of semantic metadata to represent context information and context-dependent adaptation strategies makes it possible to build context-aware middleware suitable for all dynamically available portable devices. Semantic metadata provide powerful knowledge representation means to model even complex context information, and allow automated reasoning to infer additional and/or more complex knowledge from available context data (a toy example follows below). In addition, we suggest that, by adopting proper configuration and deployment strategies, semantic support features can be provided to differentiated users and devices according to their specific needs and current context. This thesis has investigated novel design guidelines and implementation options for semantic-based context-aware middleware solutions targeted at pervasive environments. These guidelines have been applied to different application areas within pervasive computing that would particularly benefit from the exploitation of context. Common to all applications is the key role of context in enabling mobile users to personalize applications based on their needs and current situation. The main contributions of this thesis are (i) the definition of a metadata model to represent and reason about context, (ii) the definition of a model for the design and development of context-aware middleware based on semantic metadata, (iii) the design of three novel middleware architectures and the development of a prototype implementation for each of them, and (iv) the proposal of a viable approach to the portability issues raised by the adoption of semantic support services in pervasive applications.
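To make the claim about semantic metadata and automated reasoning concrete, the toy sketch below represents context as subject-predicate-object triples and derives new facts with one transitivity rule; it is an illustration only, not the metadata model defined in the thesis:

```python
# Toy context model as subject-predicate-object triples, with one
# inference rule; purely illustrative, not the thesis' metadata model.
context = {
    ("alice", "locatedIn", "room42"),
    ("room42", "partOf", "building7"),
    ("alice", "usesDevice", "tablet01"),
}

def infer_located_in(triples: set) -> set:
    """If X locatedIn Y and Y partOf Z, then X locatedIn Z."""
    derived = set(triples)
    changed = True
    while changed:  # iterate until no new facts appear (fixed point)
        changed = False
        for (x, p1, y) in list(derived):
            for (y2, p2, z) in list(derived):
                if p1 == "locatedIn" and p2 == "partOf" and y == y2:
                    fact = (x, "locatedIn", z)
                    if fact not in derived:
                        derived.add(fact)
                        changed = True
    return derived

for fact in sorted(infer_located_in(context) - context):
    print("inferred:", fact)  # ('alice', 'locatedIn', 'building7')
```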
Abstract:
Two of the main features of today's complex software systems, like pervasive computing systems and Internet-based applications, are distribution and openness. Distribution revolves around three orthogonal dimensions: (i) distribution of control: systems are characterised by several independent computational entities and devices, each representing an autonomous and proactive locus of control; (ii) spatial distribution: entities and devices are physically distributed and connected in a global (such as the Internet) or local network; and (iii) temporal distribution: interacting system components come and go over time, and are not required to be available for interaction at the same time. Openness deals with the heterogeneity and dynamism of system components: complex computational systems are open to the integration of diverse components, heterogeneous in terms of architecture and technology, and are dynamic since they allow components to be updated, added, or removed while the system is running. The engineering of open and distributed computational systems mandates the adoption of a software infrastructure whose underlying model and technology can provide the required level of uncoupling among system components. This is the main motivation behind current research trends in the area of coordination middleware to exploit tuple-based coordination models in the engineering of complex software systems, since they intrinsically provide coordinated components with communication uncoupling (further details can be found in the references therein). An additional daunting challenge for tuple-based models comes from knowledge-intensive application scenarios, namely, scenarios where most of the activities are based on knowledge in some form, and where knowledge becomes the prominent means by which systems get coordinated. Handling knowledge in tuple-based systems induces problems in terms of syntax - e.g., two tuples containing the same data may not match due to differences in the tuple structure - and (mostly) of semantics - e.g., two tuples representing the same information may not match because a different syntax is adopted. Till now, the problem has been faced by exploiting tuple-based coordination within middleware for knowledge-intensive environments: e.g., experiments with tuple-based coordination within a Semantic Web middleware (analogous approaches are surveyed in the literature). However, such approaches appear to be designed to tackle the design of coordination for specific application contexts like the Semantic Web and Semantic Web Services, and they result in a rather involved extension of the tuple space model. The main goal of this thesis was to conceive a more general approach to semantic coordination. In particular, the model and technology of semantic tuple centres were developed. The tuple centre model is adopted as the main coordination abstraction to manage system interactions. A tuple centre can be seen as a programmable tuple space, i.e. an extension of a Linda tuple space, where the behaviour of the tuple space can be programmed so as to react to interaction events. By encapsulating coordination laws within coordination media, tuple centres promote coordination uncoupling among coordinated components. Then, the tuple centre model was semantically enriched: a main design choice in this work was not to completely redesign the existing syntactic tuple space model, but rather to provide a smooth extension that, although supporting semantic reasoning, keeps tuples and tuple matching as simple as possible.
By encapsulating the semantic representation of the domain of discourse within coordination media, semantic tuple centres promote semantic uncoupling among coordinated components. The main contributions of the thesis are: (i) the design of the semantic tuple centre model; (ii) the implementation and evaluation of the model based on an existing coordination infrastructure; (iii) a view of the application scenarios in which semantic tuple centres seem suitable as coordination media.
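For readers unfamiliar with Linda-style coordination, the toy Python sketch below shows purely syntactic tuple matching and the kind of mismatch that motivates the semantic extension; the semantic layer itself is not reproduced here, and all names are illustrative:

```python
# Minimal Linda-style tuple space: None acts as a wildcard in templates.
# Illustrative only; the thesis' semantic tuple centres add reasoning
# on top of (not instead of) this syntactic matching.
class TupleSpace:
    def __init__(self):
        self._tuples = []

    def out(self, t: tuple) -> None:
        """Insert a tuple into the space."""
        self._tuples.append(t)

    def rd(self, template: tuple):
        """Read (without removing) the first tuple matching the template."""
        for t in self._tuples:
            if len(t) == len(template) and all(
                p is None or p == v for p, v in zip(template, t)
            ):
                return t
        return None

ts = TupleSpace()
ts.out(("temperature", "room42", 21.5))
print(ts.rd(("temperature", None, None)))   # matches syntactically

# The mismatch the thesis motivates: same information, different
# structure -> no match without a semantic layer.
ts.out(("room42", "temp_celsius", 21.5))
print(ts.rd(("temp_celsius", None, None)))  # None: structure differs
```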
Abstract:
The hard X-ray band (10–100 keV) has so far been observed only by collimated and coded aperture mask instruments, with a sensitivity and an angular resolution about two orders of magnitude worse than those of the current X-ray focusing telescopes operating below 10–15 keV. The technological advances in X-ray mirrors and detection systems now make it possible to extend the X-ray focusing technique to the hard X-ray domain, filling the gap in observational performance and providing a totally new deep view of some of the most energetic phenomena of the Universe. In order to reach a sensitivity of 1 μCrab in the 10–40 keV energy range, great care in background minimization is required, a common issue for all hard X-ray focusing telescopes. In the present PhD thesis, a comprehensive analysis of the space radiation environment, the payload design, and the resulting prompt X-ray background level is presented, with the aim of driving the feasibility study of the shielding system and assessing the scientific requirements of future hard X-ray missions. A Geant4-based multi-mission background simulator, BoGEMMS, is developed to be applied to any high-energy mission for which the shielding and instrument performances are required. It allows the user to interactively create a virtual model of the telescope and expose it to the space radiation environment, tracking the particles along their path and filtering the simulated background counts as in a real observation in space (a schematic sketch of this last step is given below). Its flexibility is exploited to evaluate the background spectra of the Simbol-X and NHXM missions, as well as the soft proton scattering by the X-ray optics and the selection of the best shielding configuration. Although the Simbol-X and NHXM missions are the case studies of the background analysis, the obtained results can be generalized to any future hard X-ray telescope. For this reason, a simplified, ideal payload model is also used to select the major sources of background in LEO. All the results are original contributions to the assessment studies of the cited missions, as part of the background groups' activities.
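As a schematic illustration of the filtering step mentioned above, with entirely made-up numbers and no relation to the actual BoGEMMS output format, simulated energy deposits can be histogrammed into an observation-like background spectrum:

```python
import numpy as np

# Made-up simulated energy deposits [keV] in the detector; in a real
# run these would come from the Geant4/BoGEMMS particle tracking.
rng = np.random.default_rng(0)
deposits_kev = rng.exponential(scale=30.0, size=100_000)

exposure_s = 1.0e5    # simulated observation time [s], illustrative
det_area_cm2 = 100.0  # detector geometric area [cm^2], illustrative

# Select the 10-40 keV science band and histogram into a spectrum,
# normalized to counts / (cm^2 s keV) as in a real observation.
band = (deposits_kev >= 10.0) & (deposits_kev <= 40.0)
edges = np.linspace(10.0, 40.0, 31)  # 1 keV bins
counts, _ = np.histogram(deposits_kev[band], bins=edges)
flux = counts / (det_area_cm2 * exposure_s * np.diff(edges))

print(f"mean background in band: {flux.mean():.2e} cts/cm^2/s/keV")
```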
Abstract:
Pervasive Sensing is a recent research trend that aims at providing widespread computing and sensing capabilities to enable the creation of smart environments that can sense, process, and act by considering input coming from both people and devices. The capabilities necessary for Pervasive Sensing are nowadays available on a plethora of devices, from embedded devices to PCs and smartphones. The wide availability of new devices and the large amount of data they can access enable a wide range of novel services in different areas, spanning from simple data collection systems to socially-aware collaborative filtering. However, the strong heterogeneity and unreliability of devices and sensors pose significant challenges. So far, existing works on Pervasive Sensing have focused only on limited portions of the whole stack of available devices and of the data they can use, proposing and developing mainly vertical solutions. The push from academia and industry for this kind of service shows that the time is ripe for a more general support framework for Pervasive Sensing solutions, able to enhance frail architectures, promote a well-balanced usage of resources on different devices, and enable the widest possible access to sensed data, while ensuring minimal energy consumption on battery-operated devices. This thesis focuses on pervasive sensing systems to extract design guidelines as the foundation of a comprehensive reference model for multi-tier Pervasive Sensing applications. The validity of the proposed model is tested in five different scenarios that present peculiar and different requirements, and different hardware and sensors. The ease of mapping from the proposed logical model to the real implementations and the positive results of the performance campaigns prove the quality of the proposed approach and offer a reliable reference model, together with a direction for the design and deployment of future Pervasive Sensing applications.
Abstract:
Most of the problems in modern structural design can be described with a set of equations; solutions of these mathematical models can give the engineer and designer useful information during the design stage. The same holds true for physical chemistry: this branch of chemistry uses mathematics and physics to explain real chemical phenomena. In this work two extremely different chemical processes are studied: the dynamics of an artificial molecular motor, and the generation and propagation of nervous signals between excitable cells and tissues like neurons and axons. These two processes, in spite of their chemical and physical differences, can both be described successfully by partial differential equations: respectively, the Fokker-Planck equation and the Hodgkin-Huxley model (recalled below). With the aid of advanced engineering software, these two processes have been modeled and simulated in order to extract physical information about them and to predict properties that can be extremely useful, in the future, during the design stage of both molecular motors and devices that rely on nervous communication between active fibres.
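For reference, the two models named above take the following textbook forms (a one-dimensional Fokker-Planck equation and the classic Hodgkin-Huxley membrane equation); the specific potentials and parameter values used in the thesis are not reproduced here:

```latex
% One-dimensional Fokker-Planck equation for the probability density
% p(x,t) of the motor coordinate, with drift mu(x,t) and diffusion D(x,t):
\frac{\partial p(x,t)}{\partial t}
  = -\frac{\partial}{\partial x}\bigl[\mu(x,t)\,p(x,t)\bigr]
    + \frac{\partial^2}{\partial x^2}\bigl[D(x,t)\,p(x,t)\bigr]

% Hodgkin-Huxley membrane equation with sodium, potassium, and leak currents:
C_m \frac{dV}{dt} = I_{\mathrm{ext}}
  - \bar{g}_{\mathrm{Na}}\, m^3 h\,(V - E_{\mathrm{Na}})
  - \bar{g}_{\mathrm{K}}\, n^4\,(V - E_{\mathrm{K}})
  - g_L\,(V - E_L)

% Each gating variable x in {m, h, n} obeys first-order kinetics:
\frac{dx}{dt} = \alpha_x(V)\,(1 - x) - \beta_x(V)\,x
```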
Abstract:
The juridical dimension of time, that is, the role of time in law, has contributed to the emergence of labour as a legal concept abstracted from the worker and functional to its alienation. Breaking the unity of a person's time, in order to devote a part of it to work, gave the power relations between individuals a legal form capable of legitimizing them. The first chapter focuses on the legal construction of subordinate employment through the definition of its time. Among the essential elements of the contract typified by Art. 2094 of the Italian Civil Code, time within the causa, besides showing the non-instantaneous nature of the exchange and the indeterminacy of the contractual arrangement, does not rise to the level of a criterion distinguishing subordinate employment from coordinated collaborations or self-employment. Moving from the causa to the object of the contract, the interpreter encounters a reluctance to reveal the true element on which the contract bears, the person of the worker, a reluctance that leads to a preference for the fiction of detaching the activity from the body of the person who produces it: working time is the legal technique that makes it possible for labour to take part in a logic of exchange. The second and third chapters focus on the regulation and interpretation of working time from a diachronic perspective. Labour legislation had its beginnings in the heteronomous regulation of working hours, as an instrument to protect workers from mercantilist excesses. The current regulatory framework is very attentive to the interests of the employer as creditor, but the case law of the Court of Justice testifies to the restlessness of a subject that is alive and in motion. It is no coincidence that the European legislator, in the small steps taken to strengthen the social law common to the Member States, has assigned time a central role: as predictability of working conditions (Directive 2019/1152), as balance between professional activity and family life (Directive 2019/1158), and perhaps, tomorrow, as a right to disconnect.
Abstract:
This research focuses on the modes of production and reception of theatricality in contemporary performative practices with aesthetic aims. In particular, it investigates practices that, within performatic ecosystems, design action by resorting to strategies and devices for the theatricalization of the event through immersive co-participatory models, intervening in the semio-cognitive mechanisms of the spectator's interpretation. The concept of performatic ecosystems makes it possible to single out the different semiotic formations that emerge from the performative continuum of the semiosphere, capturing the ecological and evolutionary relationships that are established diachronically among theatrical forms. It is above all the transformations that are grasped, giving semiotic analysis an image of the performatic arts that is dynamic and rooted in culture and society, and of the ways in which the basic mechanisms of theatricality take shape. Using an ethnographic, ecological, and cognitive approach, the theme of corporeality and regimes of presence is addressed, introducing into relational analysis the concept of emplacement to complement the notion of embodiment. An autopoietic model of enunciation as an act of showing is also developed, based on the metaphor of "conversation". In the ecology of the performatic environment, an "interactive field" is created between actor and spectator, in which theatrical enunciation takes place. Through case studies, it is shown how immersive co-participatory experiences unhinge and reconfigure the set of norms and usages naturalized in the Western theatrical tradition of drama. Finally, the relationship between frontality and immersivity comes to be conceived not in terms of an opposition between contraries, but as a relationship of continuity, a constant of performatic discourse subject to multiform gradations. That between actor and spectator is an interaction, a dialogue, played out not on the frontality/immersivity relation but on the interactivity/non-interactivity one, from whose articulation emerge the different and ever-changing theatrical forms that populate, and will populate, performatic ecosystems.