36 results for non-process elements

in AMS Tesi di Dottorato - Alm@DL - Università di Bologna


Relevance:

100.00%

Publisher:

Abstract:

This PhD dissertation presents a profound study of the vulnerability of buildings and non-structural elements stemming from the investigation of the Mw 5.2 Lorca 2011 earthquake, which constitutes one of the most significant earthquakes in Spain. It left nine fatalities due to falling debris from reinforced concrete buildings, 394 injured and material damage valued at 800 million euros. Within this framework, the most relevant initiatives concerning the vulnerability of buildings and the exposure of Lorca are studied. This work revealed two lines of research: the elaboration of a rational method to determine the adequacy of a specific fragility curve for the particular seismic risk study of a region, and the relevance of researching the seismic performance of non-structural elements. As a consequence, firstly, a method to assess and select fragility curves for seismic risk studies from the catalogue of those available in the literature is elaborated and calibrated by means of a case study. This methodology is based on a multidimensional index and provides a ranking that classifies the curves in terms of adequacy. Its results for the case of Lorca led to the elaboration of new fragility curves for unreinforced masonry buildings. Moreover, a simplified method to account for the unpredictable directionality of the seismic input in the creation of fragility curves is also provided. Secondly, the characterisation of the seismic capacity and demand of the non-structural elements that caused most of the human losses is studied. Concerning the capacity, an analytical approach derived from theoretical considerations to characterise the complete out-of-plane seismic response curve of unreinforced masonry cantilever walls is provided, as well as a simplified and more practical trilinear version of it. Concerning the demand, several methods for characterising the floor response spectra of reinforced concrete buildings are tested through case studies.
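The ranking idea can be sketched in a few lines: aggregate per-criterion adequacy scores of each candidate fragility curve into a single weighted index and sort. This is a purely illustrative sketch, not the thesis's actual index; the criterion names and weights below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class FragilityCurve:
    name: str
    scores: dict  # criterion -> adequacy score in [0, 1] (hypothetical criteria)

def adequacy_index(curve, weights):
    """Weighted aggregation of per-criterion scores into one adequacy index."""
    total = sum(weights.values())
    return sum(w * curve.scores.get(c, 0.0) for c, w in weights.items()) / total

def rank_curves(curves, weights):
    """Return the candidate curves sorted from most to least adequate."""
    return sorted(curves, key=lambda c: adequacy_index(c, weights), reverse=True)
```

For instance, a curve scoring well on the building typology match would outrank one scoring well only on regional seismicity, if typology carries more weight.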

Relevance:

90.00%

Publisher:

Abstract:

Traditional software engineering approaches and metaphors fall short when applied to areas of growing relevance such as electronic commerce, enterprise resource planning, and mobile computing: such areas, in fact, generally call for open architectures that may evolve dynamically over time so as to accommodate new components and meet new requirements. This is probably one of the main reasons why the agent metaphor and the agent-oriented paradigm are gaining momentum in these areas. This thesis deals with the engineering of complex software systems in terms of the agent paradigm. This paradigm is based on the notions of agent and of systems of interacting agents as fundamental abstractions for designing, developing and managing at runtime typically distributed software systems. However, today the engineer often works with technologies that do not support the abstractions used in the design of the systems. For this reason, research on methodologies becomes a central point of the scientific activity. Currently most agent-oriented methodologies are supported by small teams of academic researchers; as a result, most of them are at an early stage and still in the context of mostly "academic" approaches to agent-oriented systems development. Moreover, such methodologies are not well documented, and very often they are defined and presented by focusing only on specific aspects of the methodology. The role played by meta-models becomes fundamental for comparing and evaluating methodologies. In fact, a meta-model specifies the concepts, rules and relationships used to define methodologies. Although it is possible to describe a methodology without an explicit meta-model, formalising the underpinning ideas of the methodology in question is valuable when checking its consistency or planning extensions or modifications. A good meta-model must address all the different aspects of a methodology, i.e. the process to be followed, the work products to be generated, and those responsible for making all this happen. In turn, specifying the work products that must be developed implies defining the basic modelling building blocks from which they are built. As a building block, the agent abstraction alone is not enough to fully model all the aspects related to multi-agent systems in a natural way. In particular, different perspectives exist on the role that the environment plays within agent systems; however, it is at least clear that all non-agent elements of a multi-agent system are typically considered part of the multi-agent system environment. The key role of the environment as a first-class abstraction in the engineering of multi-agent systems is today generally acknowledged in the multi-agent systems community, so the environment should be explicitly accounted for in the engineering of multi-agent systems, working as a new design dimension for agent-oriented methodologies. At least two main ingredients shape the environment: environment abstractions - entities of the environment encapsulating some functions - and topology abstractions - entities of the environment that represent its (either logical or physical) spatial structure. In addition, the engineering of non-trivial multi-agent systems requires principles and mechanisms for managing the complexity of the system representation. These principles lead to the adoption of a multi-layered description, which can be used by designers to provide different levels of abstraction over multi-agent systems. The research in these fields has led to the formulation of a new version of the SODA methodology where environment abstractions and layering principles are exploited for engineering multi-agent systems.
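A minimal sketch of how the building blocks named above (agents, environment abstractions, topology abstractions) and the layering principle might be represented as a data model; the class names and API are hypothetical illustrations, not SODA's actual notation.

```python
from dataclasses import dataclass, field

@dataclass
class Abstraction:
    name: str

@dataclass
class Agent(Abstraction):
    pass

@dataclass
class EnvironmentAbstraction(Abstraction):
    """Entity of the environment encapsulating some function."""
    pass

@dataclass
class TopologyAbstraction(Abstraction):
    """Entity representing the (logical or physical) spatial structure."""
    pass

@dataclass
class Layer:
    level: int
    elements: list = field(default_factory=list)

class MASModel:
    """Multi-layered description of a multi-agent system."""
    def __init__(self):
        self.layers = {}

    def place(self, element, level):
        self.layers.setdefault(level, Layer(level)).elements.append(element)

    def view(self, level):
        """All elements visible at a given abstraction level (that level and below)."""
        return [e for lvl, layer in sorted(self.layers.items())
                if lvl <= level for e in layer.elements]
```

Designers could then inspect the system at coarse or fine granularity by querying different levels, which is the layering idea in miniature.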

Relevance:

80.00%

Publisher:

Abstract:

Chemical contamination represents one of the main risks to food safety and can also cause serious harm to human health. This doctoral thesis covers three families of contaminants: mycotoxins, metals and insecticides. Aflatoxin B1 was screened in 90 packages of flour, both organic and conventional. The mycotoxin was detected only in maize flours. Only one conventionally produced sample exceeded the legal limit of 2 ppb. The most notable finding was that a 5-gram sampling amount proved not to be representative of the whole commercial flour package, whereas a 20-gram sample proved more reliable. Aflatoxin M1 was screened in 58 milk samples, 35 of which tested positive; however, the levels found were consistently below the legal limit. For the detection of ochratoxin A, 114 bile samples, 35 plasma samples and 40 kidney samples taken from chickens in Jordan were extracted, purified and analysed by HPLC-FL; the analyses gave consistently negative results. Seventy-two samples (30 muscle, 29 liver and 13 kidney) taken from 30 cattle at the Irbid slaughterhouse (Jordan), aged between 8 and 30 months and coming from different farms, were analysed for 13 essential and non-essential elements. In this study no sample exceeded the maximum levels established by European legislation for the elements considered. Finally, 37 sheep-milk samples and 31 cow-milk samples, collected from several farms in Jordan, were analysed for 4 neonicotinoids (imidacloprid, acetamiprid, thiamethoxam and thiacloprid). The samples, analysed by HPLC/MS/MS, were consistently negative for all four neonicotinoids.

Relevance:

40.00%

Publisher:

Abstract:

Eukaryotic ribosomal DNA constitutes a multigene family organized in a cluster called the nucleolar organizer region (NOR); this region is usually composed of hundreds to thousands of tandemly repeated units. Ribosomal genes, being repeated sequences, evolve following the typical pattern of concerted evolution. The autonomous retroelement R2 inserts into the 28S ribosomal gene, leading to defective 28S rDNA genes. As a retrotransposon, R2 multiplies its copy number in the genome through a "copy and paste" mechanism called target-primed reverse transcription: the element's mRNA is reverse-transcribed into DNA, which is then integrated at the target site. Since reverse transcription can be interrupted while integration is carried out anyway, truncated copies of the element are also present in the genome. The study of these truncated variants is a tool to examine the activity of the element. R2 phylogeny appears, in general, not consistent with that of its hosts, except in some cases (e.g. Drosophila spp. and Reticulitermes spp.); moreover, R2 is absent in some species (Fugu rubripes, human, mouse, etc.), while other species carry more than one R2 lineage in their genome (the turtle Mauremys reevesii, the Japanese beetle Popillia japonica, etc.). The R2 elements presented here were isolated from 4 species of notostracan branchiopods and from two species of stick insects, whose reproductive strategies range from strict gonochorism to unisexuality. Sequencing data show that in Triops cancriformis (Spanish gonochoric population), in Lepidurus arcticus (two putatively unisexual populations from Iceland) and in Bacillus rossius (gonochoric population from Capalbio) the R2 elements are complete and encode functional proteins, reflecting the general features of this family of transposable elements. On the other hand, R2 from the Italian and Austrian populations of T. cancriformis (unisexual and hermaphroditic, respectively), from Lepidurus lubbocki (two elements within the same Italian population, gonochoric but with non-functional males) and from Bacillus grandii grandii (gonochoric population from Ponte Manghisi) have sequences that encode incomplete or non-functional proteins in which only part of the characteristic domains can be recognized. In Lepidurus couesii (Italian gonochoric populations) different elements were found, as in L. lubbocki, and sequencing is still in progress. Two hypotheses are proposed to explain the inconsistency between R2 and host phylogenies: vertical inheritance of the element followed by extinction/diversification, or horizontal transmission. My data support previous studies indicating vertical transmission as the most likely explanation; nevertheless, horizontal transfer events cannot be excluded. I also studied the element's activity in Spanish populations of T. cancriformis, in L. lubbocki, in L. arcticus and in gonochoric and parthenogenetic populations of B. rossius. In gonochoric populations of T. cancriformis and B. rossius I found that each individual has its own private set of truncated variants. The situation is the opposite for the remaining hermaphroditic/parthenogenetic species and populations, where all individuals share, in the samples analyzed so far, the majority of variants. This situation is very interesting because it is not concordant with Muller's ratchet theory, which predicts that parthenogenetic populations should be either devoid of transposable elements or overloaded with them. My data instead suggest a possible epigenetic mechanism that blocks retrotransposon activity, so that deleterious mutations do not accumulate.
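The private-versus-shared comparison can be illustrated numerically: given each individual's set of truncated-variant sizes (e.g. from 5' truncation breakpoints), the fraction of observed variants common to all individuals separates the two patterns. This is a hedged sketch; the variant sizes below are invented for illustration, not data from the thesis.

```python
def shared_fraction(variant_sets):
    """Fraction of all observed truncated-variant sizes shared by every individual.

    variant_sets: one set of variant sizes per individual.
    Near 0 -> private sets (gonochoric pattern); high -> shared (parthenogenetic pattern).
    """
    union = set().union(*variant_sets)
    if not union:
        return 0.0
    common = set.intersection(*variant_sets)
    return len(common) / len(union)
```

With disjoint per-individual sets the fraction is zero; when most variants recur in every individual it approaches one.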

Relevance:

30.00%

Publisher:

Abstract:

The wide use of e-technologies represents a great opportunity for underserved segments of the population, especially with the aim of reintegrating excluded individuals back into society through education. This is particularly true for people with different types of disabilities who may have difficulties while attending traditional on-site learning programs that are typically based on printed learning resources. The creation and provision of accessible e-learning contents may therefore become a key factor in enabling people with different access needs to enjoy quality learning experiences and services. Another e-learning challenge is represented by m-learning (mobile learning), which is emerging as a consequence of the diffusion of mobile terminals and provides the opportunity to browse didactical materials everywhere, outside places traditionally devoted to education. Both situations share the need to access materials under constrained conditions, and collide with the growing use of rich media in didactical contents, which are designed to be enjoyed without any restriction. Nowadays, Web-based teaching makes great use of multimedia technologies, ranging from Flash animations to prerecorded video-lectures. Rich media in e-learning can offer significant potential in enhancing the learning environment, by helping to increase access to education, enhancing the learning experience and supporting multiple learning styles. Moreover, they can often be used to improve the structure of Web-based courses. These highly variegated and structured contents may significantly improve the quality and the effectiveness of educational activities for learners. For example, rich media contents allow us to describe complex concepts and process flows. Audio and video elements may be utilized to add a "human touch" to distance-learning courses. Finally, real lectures may be recorded and distributed to integrate or enrich online materials.
A confirmation of the advantages of these approaches can be seen in the exponential growth of video-lecture availability on the net, due to the ease of recording and delivering activities which take place in a traditional classroom. Furthermore, the wide use of assistive technologies for learners with disabilities injects new life into e-learning systems. E-learning allows distance and flexible educational activities, thus helping disabled learners to access resources which would otherwise present significant barriers for them. For instance, students with visual impairments have difficulties in reading traditional visual materials, deaf learners have trouble in following traditional (spoken) lectures, and people with motion disabilities have problems in attending on-site programs. As already mentioned, the use of wireless technologies and pervasive computing may greatly enhance the educational experience by offering mobile e-learning services that can be accessed by handheld devices. This new paradigm of educational content distribution maximizes the benefits for learners, since it enables users to overcome constraints imposed by the surrounding environment. While certainly helpful for users without disabilities, we believe that the use of new mobile technologies may also become a fundamental tool for impaired learners, since it frees them from sitting in front of a PC. In this way, educational activities can be enjoyed by all users, without hindrance, thus increasing the social inclusion of non-typical learners. While the provision of fully accessible and portable video-lectures may be extremely useful for students, it is widely recognized that structuring and managing rich media contents for mobile learning services are complex and expensive tasks. Indeed, major difficulties originate from the basic need to provide a textual equivalent for each media resource composing a rich media Learning Object (LO).
Moreover, tests need to be carried out to establish whether a given LO is fully accessible to all kinds of learners. Unfortunately, both these tasks are truly time-consuming processes, depending on the type of contents the teacher is writing and on the authoring tool he/she is using. Due to these difficulties, online LOs are often distributed as partially accessible or totally inaccessible content. Bearing this in mind, this thesis aims to discuss the key issues of a system we have developed to deliver accessible, customized or nomadic learning experiences to learners with different access needs and skills. To reduce the risk of excluding users with particular access capabilities, our system exploits Learning Objects (LOs) which are dynamically adapted and transcoded based on the specific needs of non-typical users and on the barriers that they can encounter in the environment. The basic idea is to dynamically adapt contents, by selecting them from a set of media resources packaged in SCORM-compliant LOs and stored in a self-adapting format. The system schedules and orchestrates a set of transcoding processes based on specific learner needs, so as to produce a customized LO that can be fully enjoyed by any (impaired or mobile) student.
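A minimal sketch of the adaptation idea under stated assumptions: each slot of a LO packages several media variants, each annotated with the perceptual capabilities it requires, and the richest variant compatible with the learner's profile is selected. The capability model, names and data shapes are hypothetical illustrations, not the system's actual SCORM transcoding machinery.

```python
def select_resources(lo_resources, profile):
    """Pick, for each content slot, the variant best matching the learner profile.

    lo_resources: slot -> {variant_name: set of required capabilities}
    profile: {"capabilities": set of capabilities the learner/device supports}
    """
    chosen = {}
    caps = profile["capabilities"]
    for slot, variants in lo_resources.items():
        # A variant is usable only if the learner has every capability it requires.
        usable = {v: req for v, req in variants.items() if req <= caps}
        if usable:
            # Prefer the richest variant the learner can actually use
            # (here: the one with the most satisfied requirements).
            chosen[slot] = max(usable, key=lambda v: len(usable[v]))
    return chosen
```

For example, a lecture slot offering a video (requires sight and hearing), an audio track (requires hearing) and a text transcript (no requirement) would resolve to the audio track for a blind learner and to the transcript for a deaf learner.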

Relevance:

30.00%

Publisher:

Abstract:

The ideal approach for the long-term treatment of intestinal disorders, such as inflammatory bowel disease (IBD), is represented by a safe and well tolerated therapy able to reduce mucosal inflammation and maintain homeostasis of the intestinal microbiota. A combined therapy with antimicrobial agents, to reduce the antigenic load, and immunomodulators, to ameliorate the dysregulated responses, followed by probiotic supplementation has been proposed. Because of the complementary mechanisms of action of antibiotics and probiotics, a combined therapeutic approach would give advantages in terms of enlargement of the antimicrobial spectrum, due to the barrier effect of probiotic bacteria, and limitation of some side effects of traditional chemotherapy (i.e. indiscriminate decrease of aggressive and protective intestinal bacteria, altered absorption of nutrient elements, allergic and inflammatory reactions). Rifaximin (4-deoxy-4'-methylpyrido[1',2'-1,2]imidazo[5,4-c]rifamycin SV) is a product of synthesis experiments designed to modify the parent compound, rifamycin, in order to achieve low gastrointestinal absorption while retaining good antibacterial activity. Both experimental and clinical pharmacology clearly show that this compound is a non-systemic antibiotic with a broad spectrum of antibacterial action, covering Gram-positive and Gram-negative organisms, both aerobes and anaerobes. Being virtually unabsorbed, the drug reaches a rather high availability within the gastrointestinal tract, with intraluminal and faecal concentrations that largely exceed the MIC values observed in vitro against a wide range of pathogenic microorganisms. The gastrointestinal tract therefore represents the primary therapeutic target, and gastrointestinal infections the main indication. The negligible activity of rifaximin outside the enteric area minimizes both antimicrobial resistance and systemic adverse events.
Fermented dairy products enriched with probiotic bacteria have developed into one of the most successful categories of functional foods. Probiotics are defined as "live microorganisms which, when administered in adequate amounts, confer a health benefit on the host" (FAO/WHO, 2002), and mainly include Lactobacillus and Bifidobacterium species. Probiotic bacteria exert a direct effect on the intestinal microbiota of the host and contribute to the organoleptic, rheological and nutritional properties of food. Administration of pharmaceutical probiotic formulas has been associated with therapeutic effects in the treatment of diarrhoea, constipation, flatulence, colonization by enteropathogens, gastroenteritis, hypercholesterolemia, IBD such as ulcerative colitis (UC) and Crohn's disease, pouchitis, and irritable bowel syndrome. Prerequisites for probiotics are to be effective and safe. The characteristics of an effective probiotic for gastrointestinal tract disorders are tolerance to the upper gastrointestinal environment (resistance to digestion by enteric or pancreatic enzymes, gastric acid and bile), adhesion to the intestinal surface to lengthen the retention time, ability to prevent the adherence, establishment and/or replication of pathogens, production of antimicrobial substances, degradation of toxic catabolites by bacterial detoxifying enzymatic activities, and modulation of the host immune responses. This study was carried out using a validated three-stage continuous fermentative system and aimed to investigate the effect of rifaximin on the colonic microbiota of a healthy individual, in terms of bacterial composition and production of fermentative metabolic end products. Moreover, this is the first study to investigate in vitro the impact of the simultaneous administration of the antibiotic rifaximin and the probiotic B. lactis BI07 on the intestinal microbiota.
Bacterial groups of interest were evaluated using culture-based methods and molecular culture-independent techniques (FISH, PCR-DGGE). Metabolic outputs in terms of SCFA profiles were determined by HPLC analysis. The collected data demonstrated that neither the rifaximin treatment nor the combined antibiotic and probiotic treatment drastically changed the intestinal microbiota, whereas bacteria belonging to Bifidobacterium and Lactobacillus significantly increased over the course of the treatment, suggesting a spontaneous upsurge of rifaximin resistance. These results are in agreement with a previous study, in which it was demonstrated that rifaximin administration in patients with UC causes only minor variations of the intestinal microbiota, and that the microbiota is restored over a wash-out period. In particular, several rifaximin-resistant Bifidobacterium mutants could be isolated during the antibiotic treatment, but they disappeared after the antibiotic was suspended. Furthermore, bacteria belonging to Atopobium spp. and the E. rectale/Clostridium cluster XIVa increased significantly after the rifaximin and probiotic treatment. The Atopobium genus and the E. rectale/Clostridium cluster XIVa are saccharolytic, butyrate-producing bacteria, and for these characteristics they are widely considered health-promoting microorganisms. The absence of major variations in the intestinal microbiota of a healthy individual and the significant increase in the concentrations of probiotic and health-promoting bacteria support the rationale of administering rifaximin as an efficacious and non-dysbiosis-promoting therapy, and suggest the efficacy of a combined antibiotic/probiotic treatment in several gut pathologies, such as IBD.
To assess the use of an antibiotic/probiotic combination for the clinical management of intestinal disorders, genetic, proteomic and physiological approaches were employed to elucidate the molecular mechanisms determining rifaximin resistance in Bifidobacterium, and the expected interactions occurring in the gut between these bacteria and the drug. The ability of an antimicrobial agent to select for resistance is a relevant factor that affects its usefulness and may diminish its useful life. The rifaximin resistance phenotype was easily acquired by all bifidobacteria analyzed [type strains of the most representative intestinal bifidobacterial species (B. infantis, B. breve, B. longum, B. adolescentis and B. bifidum) and three bifidobacteria included in a pharmaceutical probiotic preparation (B. lactis BI07, B. breve BBSF and B. longum BL04)] and persisted for more than 400 bacterial generations in the absence of selective pressure. The exclusion of any reversion phenomenon suggested two hypotheses: (i) stable and immobile genetic elements encode the resistance; (ii) the drug moiety does not act as an inducer of the resistance phenotype, but enables selection of resistant mutants. Since point mutations in rpoB have been indicated as the principal factor determining rifampicin resistance in E. coli and M. tuberculosis, it was verified whether a similar mechanism also occurs in Bifidobacterium. The analysis of a 129 bp rpoB core region of several wild-type and resistant bifidobacteria revealed five different types of missense mutations in codons 513, 516, 522 and 529. Position 529 was a novel mutation site, not previously described, and position 522 appeared interesting both for the double point substitutions and for the heterogeneous profile of nucleotide changes. The sequence heterogeneity of codon 522 in Bifidobacterium leads to the hypothesis of an indirect role of its encoded amino acid in the binding with the rifaximin moiety.
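The kind of core-region comparison described above can be sketched as a codon-by-codon diff between aligned wild-type and mutant sequences. This is an illustrative sketch only: the numbering offset is hypothetical, the toy sequences are invented, and deciding whether a change is actually missense (rather than silent) would additionally require translating codons with a codon table.

```python
def codon_diffs(wt, mut, first_codon=510):
    """Compare two aligned coding sequences codon by codon.

    Returns a list of (codon_number, wt_codon, mut_codon) for every codon that
    differs at the nucleotide level. first_codon is a hypothetical numbering
    offset so that reported positions match a gene's codon coordinates.
    """
    assert len(wt) == len(mut), "sequences must be aligned to the same length"
    assert len(wt) % 3 == 0, "length must be a whole number of codons"
    diffs = []
    for i in range(0, len(wt), 3):
        if wt[i:i + 3] != mut[i:i + 3]:
            diffs.append((first_codon + i // 3, wt[i:i + 3], mut[i:i + 3]))
    return diffs
```

A 129 bp region spans 43 codons, so with a suitable offset such a scan would cover positions like 513, 516, 522 and 529 in one pass.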
These results demonstrated the chromosomal nature of rifaximin resistance in Bifidobacterium, minimizing the risk of horizontal transmission of resistance elements between intestinal microbial species. Further proteomic and physiological investigations were carried out using B. lactis BI07, a component of a pharmaceutical probiotic preparation, as a model strain. The choice of this strain was based on the following elements: (i) B. lactis BI07 is able to survive and persist in the gut; (ii) a proteomic overview of this strain has recently been reported. The involvement of metabolic changes associated with rifaximin resistance was investigated by proteomic analysis performed with two-dimensional electrophoresis and mass spectrometry. Comparative proteomic mapping of BI07-wt and BI07-res revealed that most differences in protein expression patterns were genetically encoded rather than induced by antibiotic exposure. In particular, the rifaximin resistance phenotype was characterized by increased expression levels of stress proteins. Overexpression of stress proteins was expected, as they represent a common nonspecific response of bacteria to different shock conditions, including exposure to toxic agents such as heavy metals, oxidants, acids, bile salts and antibiotics. Also, positive transcription regulators were found to be overexpressed in BI07-res, suggesting that the bacteria could activate compensatory mechanisms to assist the transcription process in the presence of RNA polymerase inhibitors. Other differences in the expression profiles were related to proteins involved in central metabolism; these modifications suggest metabolic disadvantages of the resistant mutants in comparison with sensitive bifidobacteria in the gut environment in the absence of selective pressure, explaining their disappearance from the faeces of patients with UC after interruption of the antibiotic treatment.
The differences observed between the BI07-wt and BI07-res proteomic patterns, as well as the high frequency of silent mutations reported for resistant mutants of Bifidobacterium, could be the consequence of an increased mutation rate, a mechanism which may lead to the persistence of resistant bacteria in the population. However, the in vivo disappearance of resistant mutants in the absence of selective pressure allows us to exclude the emergence of compensatory mutations without loss of resistance. Furthermore, the proteomic characterization of the resistant phenotype suggests that rifaximin resistance is associated with a reduced bacterial fitness in B. lactis BI07-res, supporting the hypothesis of a biological cost of antibiotic resistance in Bifidobacterium. The hypothesis of rifaximin inactivation by bacterial enzymatic activities was tested using liquid chromatography coupled with tandem mass spectrometry. Neither chemical modifications nor degradation derivatives of the rifaximin moiety were detected. The exclusion of a biodegradation pattern for the drug was further supported by the quantitative recovery, in the BI07-res culture fractions, of the total rifaximin amount (100 μg/ml) added to the culture medium. To confirm the main role of the mutation in the β chain of RNA polymerase in the acquisition of rifaximin resistance, the transcription activity of crude enzymatic extracts of BI07-res cells was evaluated. Although the inhibitory effects of rifaximin on in vitro transcription were definitely higher for BI07-wt than for BI07-res, a partial resistance of the mutated RNA polymerase at rifaximin concentrations > 10 μg/ml was inferred, on the basis of the calculated differences in inhibition percentages between BI07-wt and BI07-res. Considering the resistance of whole BI07-res cells to rifaximin concentrations > 100 μg/ml, supplementary resistance mechanisms may take place in vivo.
A barrier to rifaximin uptake in BI07-res cells was suggested in this study, on the basis of the major portion of the antibiotic found bound to the cellular pellet compared with the portion recovered in the cellular lysate. Related to this finding, a resistance mechanism involving changes in membrane permeability was proposed. A previous study supports this hypothesis, demonstrating the involvement of surface properties and permeability in natural resistance to rifampicin in mycobacteria, isolated from cases of human infection, which possessed a rifampicin-susceptible RNA polymerase. To understand the mechanism of the membrane barrier, variations in the percentages of saturated and unsaturated FAs and their methylation products in BI07-wt and BI07-res membranes were investigated. While saturated FAs confer rigidity to the membrane and resistance to stress agents, such as antibiotics, a high level of lipid unsaturation is associated with high fluidity and susceptibility to stresses. Thus, the higher percentage of saturated FAs during the stationary phase of BI07-res could represent a defence mechanism of the mutant cells to prevent antibiotic uptake. Furthermore, the increase of CFAs such as dihydrosterculic acid during the stationary phase of BI07-res suggests that this CFA could be more suitable than its isomer, lactobacillic acid, to interact with and prevent the penetration of exogenous molecules, including rifaximin. Finally, the impact of rifaximin on the immune regulatory functions of the gut was evaluated. A potential anti-inflammatory effect of rifaximin has been suggested, with reduced secretion of IFN-γ in a rodent model of colitis. Similarly, a significant decrease in IL-8, MCP-1, MCP-3 and IL-10 levels has been reported in patients affected by pouchitis, treated with a combined therapy of rifaximin and ciprofloxacin.
Since rifaximin enables in vivo and in vitro selection of Bifidobacterium resistant mutants with high frequency, the immunomodulatory activities of rifaximin associated with a B. lactis resistant mutant were also taken into account. Data obtained from PBMC stimulation experiments suggest the following conclusions: (i) rifaximin does not exert any effect on the production of IL-1β, IL-6 and IL-10, whereas it weakly stimulates the production of TNF-α; (ii) B. lactis appears to be a good inducer of IL-1β, IL-6 and TNF-α; (iii) the combination of BI07-res and rifaximin exhibits a lower stimulation effect than BI07-res alone, especially for IL-6. These results confirm the potential anti-inflammatory effect of rifaximin, and are in agreement with several studies that report a transient pro-inflammatory response associated with probiotic administration. The understanding of the molecular factors determining rifaximin resistance in the genus Bifidobacterium has practical significance at the pharmaceutical and medical levels, as it represents the scientific basis to justify the simultaneous use of the antibiotic rifaximin and probiotic bifidobacteria in the clinical treatment of intestinal disorders.

Relevance:

30.00%

Publisher:

Abstract:

It is well known that theories of the firm have evolved along a path paved by an increasing awareness of the importance of organizational structure. The trajectory runs from the early "neoclassical" conceptualizations, which intended the firm as a rational actor whose aim is to produce the amount of output that maximizes revenue, given the inputs at its disposal and in accordance with technological or environmental constraints (see Boulding, 1942 for a mid-century state-of-the-art discussion), to the knowledge-based theory of the firm (Nonaka & Takeuchi, 1995; Nonaka & Toyama, 2005), which recognizes the firm as a knowledge-creating entity with specific organizational capabilities (Teece, 1996; Teece & Pisano, 1998) that allow it to sustain competitive advantages. Tracing a map of the evolution of the theory of the firm, taking into account the several perspectives adopted in the history of thought, would take the length of many books. A more fruitful strategy is therefore to circumscribe the description of the literature to one stream connected to a crucial question about the nature of firm behaviour and the determinants of competitive advantage. In so doing I adopt a perspective that considers the organizational structure of the firm as an element according to which the different theories can be discriminated. The approach starts by considering the drawbacks of the standard neoclassical theory of the firm. After discussing the most influential theoretical approaches, I end with a close examination of the knowledge-based perspective of the firm. Within this perspective the firm is considered a knowledge-creating entity that produces and manages knowledge (Nonaka, Toyama, & Nagata, 2000; Nonaka & Toyama, 2005). In a knowledge-intensive organization, knowledge is for the most part embedded in the human capital of the individuals that compose the organization.
In a knowledge-based organization, the management, in order to cope with knowledge-intensive production, ought to develop and accumulate capabilities that shape the organizational forms in a way that relies on “cross-functional processes, extensive delayering and empowerment” (Foss 2005, p.12). This mechanism contributes to determining the absorptive capacity of the firm towards specific technologies and, in so doing, it also shapes the technological trajectories along which the firm moves. Having recognized the growing importance of the firm’s organizational structure in the theoretical literature on the theory of the firm, the subsequent step of the analysis is to provide an overview of the changes that have occurred at the micro level in the firm’s organization of production. Economic actors have to deal with the challenges posed by processes of internationalisation and globalisation, the increased and increasing competitive pressure of less developed countries on low-value-added production activities, changes in technologies, and increased environmental turbulence and volatility. As a consequence, it has been widely recognized that the main organizational models of production that fitted well in the 20th century are now partially inadequate, and processes aiming to reorganize production activities have spread across several economies in recent years. Recently, the emergence of a “new” form of production organization has been proposed by scholars, practitioners and institutions alike: the most prominent characteristic of this model is its recognition of the importance of employees’ commitment and involvement. It is consequently characterized by a strong accent on human resource management and on those practices that aim to widen the autonomy and responsibility of the workers as well as to increase their commitment to the organization (Osterman, 1994; 2000; Lynch, 2007). 
This “model” of production organization is defined by many as the High Performance Work System (HPWS). Despite the increasing diffusion of workplace practices that may be inscribed within the concept of HPWS in western companies, it is to some extent hazardous to speak of the emergence of a “new organizational paradigm”. A discussion of organizational changes and the diffusion of HPWP cannot abstract from the industrial relations systems, with a particular accent on employment relationships, because of their relevance, alongside production organization, in determining two major outcomes of the firm: innovation and economic performance. The argument is treated starting from the issue of Social Dialogue at the macro level, from both a European and an Italian perspective. The model of interaction between the social parties has repercussions, at the micro level, on the employment relationships, that is to say on the relations between union delegates and management or between workers and management. Finding economic and social policies capable of sustaining growth and employment within a knowledge-based scenario is likely to constitute the major challenge for the next generation of social pacts, which are the main outcomes of social dialogue. As Acocella and Leoni (2007) put forward, social pacts may constitute an instrument to trade wage moderation for high intensity in ICT, organizational and human capital investments. Empirical evidence, especially at the micro level, of a positive relation between economic growth and new organizational designs coupled with ICT adoption and non-adversarial industrial relations is growing. Partnership among social parties may become an instrument to enhance firm competitiveness. The outcome of the discussion is the integration of organizational change and industrial relations elements within a unified framework: the HPWS. 
Such a choice may help in disentangling the potential complementarities between these two aspects of the firm’s internal structure in their effect on economic and innovative performance. The third chapter opens the more original part of the thesis. The data used to disentangle the relations between HPWS practices, innovation and economic performance refer to the manufacturing firms of the Reggio Emilia province with more than 50 employees. The data have been collected through face-to-face interviews with both management (199 respondents) and union representatives (181 respondents). Coupled with the cross-section datasets, a further data source is constituted by longitudinal balance sheets (1994-2004). Collecting reliable data that in turn yield reliable results always requires great effort, with uncertain outcomes. Data at the micro level are often subject to a trade-off: the wider the geographical context to which the surveyed population belongs, the smaller the amount of information usually collected (low level of resolution); the narrower the focus on a specific geographical context, the larger the amount of information usually collected (high level of resolution). For the Italian case, the evidence on the diffusion of HPWP and their effects on firm performance is still scanty and usually limited to local-level studies (Cristini, et al., 2003). The thesis is also devoted to deepening an argument of particular interest: the existence of complementarities between HPWS practices. It has been widely shown by empirical evidence that when HPWP are adopted in bundles they are more likely to impact firm performance than when adopted in isolation (Ichniowski, Prennushi, Shaw, 1997). Does this hold also for the local production system of Reggio Emilia? The empirical analysis has the precise aim of providing evidence on the relations between the HPWS dimensions and the innovative and economic performance of the firm. 
As far as the first line of analysis is concerned, the fundamental role that innovation plays in the economy must be stressed (Geroski & Machin, 1993; Stoneman & Kwon 1994, 1996; OECD, 2005; EC, 2002). On this point the evidence ranges from traditional innovations, usually proxied by R&D investment expenditure or number of patents, to the introduction and adoption of ICT in recent years (Brynjolfsson & Hitt, 2000). If innovation is important, then it is critical to analyse its determinants. In this work it is hypothesised that organizational changes and the firm-level industrial relations/employment relations aspects that can be put under the heading of HPWS influence the firm’s propensity to innovate in product, process and quality. The general argument goes as follows: changes in production management and work organization reconfigure the absorptive capacity of the firm towards specific technologies and, in so doing, they shape the technological trajectories along which the firm moves; cooperative industrial relations may lead to smoother adoption of innovations, because these are not opposed by unions. From the first empirical chapter it emerges that the different types of innovation seem to respond in different ways to the HPWS variables. The underlying processes of product, process and quality innovation are likely to answer to different firm strategies and needs. Nevertheless, it is possible to extract some general results in terms of the HPWS factors that most influence innovative performance. The three main aspects are training coverage, employee involvement and the diffusion of bonuses. These variables show persistent and significant relations with all three innovation types, as do the components containing them. In sum, aspects of the HPWS influence the firm’s propensity to innovate. 
At the same time, fairly neat (although not always strong) evidence emerges of the presence of complementarities between HPWS practices. In terms of the complementarity issue it can be said that some specific complementarities exist. Training activities, when adopted and managed in bundles, are related to the propensity to innovate. Having a sound skill base may be an element that enhances the firm’s capacity to innovate: it may enhance both the capacity to absorb exogenous innovation and the capacity to develop innovations endogenously. The presence and diffusion of bonuses and employee involvement also spur innovative propensity, the former because of their incentive nature and the latter because direct worker participation may increase workers’ commitment to the organization and thus their willingness to support and suggest innovations. The other line of analysis provides results on the relation between HPWS and the economic performance of the firm. There has been a bulk of international empirical studies on the relation between organizational changes and economic performance (Black & Lynch 2001; Zwick 2004; Janod & Saint-Martin 2004; Huselid 1995; Huselid & Becker 1996; Cappelli & Neumark 2001), while the works aiming to capture the relations between economic performance and unions or industrial relations aspects are quite scant (Addison & Belfield, 2001; Pencavel, 2003; Machin & Stewart, 1990; Addison, 2005). In the empirical analysis, the integration of the two main areas of the HPWS represents a scarcely exploited approach in the panorama of both national and international empirical studies. As remarked by Addison, “although most analysis of workers representation and employee involvement/high performance work practices have been conducted in isolation – while sometimes including the other as controls – research is beginning to consider their interactions” (Addison, 2005, p.407). 
The analysis, conducted by exploiting temporal lags between the dependent variable and the covariates, a possibility afforded by the merger of cross-section and panel data, provides evidence in favour of an impact of HPWS practices on the firm’s economic performance, variously measured. Although robust evidence of complementarities among HPWS aspects on performance does not seem to emerge, there is evidence of a general positive influence of the single practices. The results are quite sensitive to the time lags, which suggests that time-varying heterogeneity is an important factor in determining the impact of organizational changes on economic performance. The implications of the analysis can be of help both to management and to local-level policy makers. Although the results cannot simply be extended to other local production systems, it may be argued that the results and implications obtained here can also fit contexts similar to the Reggio Emilia province, characterized by the presence of small and medium enterprises organized in districts and by a deep-rooted unionism with strong supporting institutions. However, a hope for future research on the subject treated in the present work is that of collecting good-quality information over wider geographical areas, possibly at national level, and repeated over time. Only in this way can the Gordian knot of the linkages between innovation, performance, high performance work practices and industrial relations be untangled.
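The complementarity argument above — that bundles of practices matter beyond the sum of the single practices — is usually tested by the sign of an interaction term. A minimal sketch on synthetic data (not the Reggio Emilia survey; all variable names and coefficients are invented for illustration):

```python
import numpy as np

# Synthetic illustration of a complementarity test: innovative performance
# regressed on two HPWS practices and their interaction. A positive
# interaction coefficient signals that the bundle pays more than the sum.
rng = np.random.default_rng(0)
n = 1000
training = rng.integers(0, 2, n)       # practice adopted (1) or not (0)
involvement = rng.integers(0, 2, n)
# assumed data-generating process: each practice helps, the bundle helps more
y = (0.2 * training + 0.3 * involvement
     + 0.4 * training * involvement    # complementarity term
     + rng.normal(0, 0.5, n))

X = np.column_stack([np.ones(n), training, involvement,
                     training * involvement])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # beta[3] > 0 indicates complementarity between the practices
```

With survey data one would of course use the observed practice indicators and add firm-level controls; the mechanics of the test are unchanged.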

Relevância:

30.00%

Publicador:

Resumo:

This work reports on two different projects that were carried out during the three years of the Doctor of Philosophy course. In the first year, a project regarding a Capacitive Pressure Sensors Array for Aerodynamic Applications was developed in the Applied Aerodynamics research team of the Second Faculty of Engineering, University of Bologna, Forlì, Italy, in collaboration with the ARCES laboratories of the same university. Capacitive pressure sensors were designed and fabricated, investigating the sensors’ mechanical and electrical behaviour theoretically and experimentally by means of finite element method simulations and wind tunnel tests. During the design phase, the sensor figures of merit were considered and evaluated for specific aerodynamic applications. The aim of this work is the production of low-cost MEMS-alternative devices suitable for a sensor network to be implemented in an air data system. The last two years were dedicated to a project regarding a Wireless Pressure Sensor Network for Nautical Applications. The aim of the developed sensor network is to sense the weak pressure field acting on the sail plan of a full-batten sail by means of instrumented battens, providing a real-time differential pressure map over the entire sail surface. The wireless sensor network and the sensing unit were designed, fabricated and tested in the faculty laboratories. A static non-linear coupled mechanical-electrostatic simulation has been developed to predict the pressure-versus-capacitance static characteristic suitable for the transduction process and to tune the geometry of the transducer so as to reach the required resolution, sensitivity and time response over the appropriate full-scale pressure input. A time-dependent viscoelastic error model has been inferred and developed by means of experimental data in order to model, predict and reduce the inaccuracy bound due to the viscoelastic phenomena affecting the Mylar® polyester film used for the sensor diaphragm. 
The developments of the two above-mentioned projects are strictly related, but they are presented separately in this work.
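The pressure-versus-capacitance characteristic mentioned above can be sketched with a much simpler model than the thesis's coupled non-linear simulation: a clamped circular diaphragm in small deflection over a fixed electrode, with the capacitance obtained by integrating the parallel-plate density over the deflected profile. All geometry and material values below are illustrative assumptions (the defaults loosely resemble a thin Mylar film), not the thesis's design:

```python
import numpy as np

# Illustrative sketch: capacitance of a clamped circular diaphragm under
# uniform pressure, small-deflection plate theory + parallel-plate
# integration. Parameters are assumptions, not the fabricated sensor.
EPS0 = 8.854e-12                        # vacuum permittivity, F/m

def capacitance(pressure, radius=5e-3, gap=50e-6,
                thickness=23e-6, E=4.9e9, nu=0.38):
    """Capacitance (F) at a given uniform pressure (Pa)."""
    D = E * thickness**3 / (12 * (1 - nu**2))        # flexural rigidity
    r = np.linspace(0.0, radius, 2000)
    w = pressure * (radius**2 - r**2)**2 / (64 * D)  # deflection w(r)
    dr = r[1] - r[0]
    # integrate eps0 / (gap - w) over annular rings of the disc
    return np.sum(EPS0 / (gap - w) * 2 * np.pi * r) * dr

C0 = capacitance(0.0)     # rest capacitance
C1 = capacitance(5.0)     # capacitance at 5 Pa: gap shrinks, C grows
print(C0, C1)
```

The monotone, non-linear C(P) curve produced this way is what the transducer geometry is tuned against; the viscoelastic error model would then be layered on top as a time-dependent correction.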

Relevância:

30.00%

Publicador:

Resumo:

This work provides a step forward in the study and comprehension of the relationships between stochastic processes and a certain class of integro-partial differential equations, which can be used to model anomalous diffusion and transport in statistical physics. In the first part, we brought the reader through the fundamental notions of probability and stochastic processes, stochastic integration and stochastic differential equations. In particular, within the study of H-sssi processes, we focused on fractional Brownian motion (fBm) and its discrete-time increment process, fractional Gaussian noise (fGn), which provide examples of non-Markovian Gaussian processes. The fGn, together with stationary FARIMA processes, is widely used in the modelling and estimation of long memory, or long-range dependence (LRD). Time series manifesting long-range dependence are often observed in nature, especially in physics, meteorology and climatology, but also in hydrology, geophysics, economics and many other fields. We studied LRD in depth, giving many real-data examples, providing statistical analysis and introducing parametric methods of estimation. We then introduced the theory of fractional integrals and derivatives, which indeed turns out to be very appropriate for studying and modelling systems with long-memory properties. After having introduced the basic concepts, we provided many examples and applications. For instance, we investigated the relaxation equation with distributed-order time-fractional derivatives, which describes models characterized by a strong memory component and can be used to model relaxation in complex systems deviating from the classical exponential Debye pattern. We then focused on the study of generalizations of the standard diffusion equation, passing through the preliminary study of the fractional forward drift equation. Such generalizations have been obtained by using fractional integrals and derivatives of distributed orders. 
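The persistence of fGn for H > 1/2, which underlies its use in LRD modelling, is easy to exhibit numerically. A hedged sketch (exact simulation by Cholesky factorisation of the fGn covariance; parameters chosen only for illustration):

```python
import numpy as np

# Exact simulation of fractional Gaussian noise (fGn) via Cholesky
# factorisation of its covariance matrix, then a check that increments
# are positively correlated (persistent) for Hurst exponent H > 1/2.
def fgn_cov(k, H):
    """Autocovariance of unit-variance fGn at integer lag k."""
    k = abs(k)
    return 0.5 * ((k + 1)**(2 * H) - 2 * k**(2 * H) + abs(k - 1)**(2 * H))

def simulate_fgn(n, H, seed=0):
    """Draw one fGn path of length n with Hurst exponent H."""
    rng = np.random.default_rng(seed)
    cov = np.array([[fgn_cov(i - j, H) for j in range(n)] for i in range(n)])
    L = np.linalg.cholesky(cov)
    return L @ rng.standard_normal(n)

x = simulate_fgn(500, H=0.8)
fbm = np.cumsum(x)            # partial sums give a fractional Brownian path
lag1 = np.corrcoef(x[:-1], x[1:])[0, 1]
print(lag1)                   # positive for H > 1/2: long-range dependence
```

For H = 0.8 the theoretical lag-1 autocorrelation is (2^1.6 − 2)/2 ≈ 0.52; the sample estimate hovers around that value. For long series one would switch to an FFT-based (circulant embedding) method, since Cholesky costs O(n³).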
In order to find a connection between the anomalous diffusion described by these equations and long-range dependence, we introduced and studied the generalized grey Brownian motion (ggBm), which is actually a parametric class of H-sssi processes whose marginal probability density function evolves in time according to a partial integro-differential equation of fractional type. The ggBm is of course non-Markovian. Throughout the work, we have remarked many times that, starting from a master equation for a probability density function f(x,t), it is always possible to define an equivalence class of stochastic processes with the same marginal density function f(x,t). All these processes provide suitable stochastic models for the starting equation. In studying the ggBm, we focused on a subclass made up of processes with stationary increments. The ggBm has been defined canonically in the so-called grey noise space; however, we have been able to provide a characterization irrespective of the underlying probability space. We also pointed out that the generalized grey Brownian motion is a direct generalization of a Gaussian process, and in particular that it generalizes both Brownian motion and fractional Brownian motion. Finally, we introduced and analyzed a more general class of diffusion-type equations related to certain non-Markovian stochastic processes. We started from the forward drift equation, which has been made non-local in time by the introduction of a suitably chosen memory kernel K(t). The resulting non-Markovian equation has been interpreted in a natural way as the evolution equation of the marginal density function of a random time process l(t). We then consider the subordinated process Y(t)=X(l(t)), where X(t) is a Markovian diffusion. The corresponding time evolution of the marginal density function of Y(t) is governed by a non-Markovian Fokker-Planck equation which involves the same memory kernel K(t). 
We developed several applications and derived the exact solutions. Moreover, we considered different stochastic models for the given equations, providing path simulations.
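The subordination construction Y(t) = X(l(t)) described above can be path-simulated directly. A hedged sketch, assuming the standard choice in which X is a Brownian motion and l(t) is the inverse of a one-sided α-stable subordinator (a classical route to time-fractional, non-Markovian diffusion); one-sided stable increments are drawn with Kanter's formula, and all parameter values are illustrative:

```python
import numpy as np

# Path simulation of a subordinated process Y(t) = X(l(t)):
# X = Brownian motion, l = inverse of an alpha-stable subordinator.
rng = np.random.default_rng(1)

def stable_increments(n, alpha, dt):
    """Increments of a one-sided alpha-stable subordinator over steps dt
    (Kanter's representation, 0 < alpha < 1)."""
    U = rng.uniform(0.0, np.pi, n)
    W = rng.exponential(1.0, n)
    S = (np.sin(alpha * U) / np.sin(U)**(1.0 / alpha)
         * (np.sin((1.0 - alpha) * U) / W)**((1.0 - alpha) / alpha))
    return dt**(1.0 / alpha) * S          # self-similar scaling in dt

alpha, dt, n = 0.7, 1e-3, 20000
T = np.cumsum(stable_increments(n, alpha, dt))   # subordinator path T(s)
s_grid = dt * np.arange(1, n + 1)

t_obs = np.linspace(0.01, 0.9 * T[-1], 200)      # observation times t
idx = np.searchsorted(T, t_obs)
l = s_grid[idx]                                  # l(t) = inf{s : T(s) > t}

B = np.cumsum(np.sqrt(dt) * rng.standard_normal(n))  # Brownian path X(s)
Y = B[idx]                                       # subordinated process Y(t)
print(Y[:3])
```

The inverse subordinator l(t) is non-decreasing and has long flat stretches (the trapping periods responsible for sub-diffusion), so Y inherits the non-Markovian waiting-time structure encoded by the memory kernel.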

Relevância:

30.00%

Publicador:

Resumo:

The main objective of the research is to identify the tools capable of controlling the quality of a specific design that responds to the strong demands of a territory's tourist market. It starts from the simplest theories framing a constant condition of humankind: TRAVELLING. The first question the research poses is that of defining a "dimension" in which people travel, where the physical concept of the space dedicated to life has shifted in the same way, and to the same extent, that people move. There exists a sort of macro-place (destination) comprising all the spaces where people arrive and from which they often depart again. Thinking about the architecture of hospitality means investigating and understanding how the home is no longer the only place where people dwell. The research grounds its theses on the importance of the "places" belonging to a territory and on how they must, through a design process, reappropriate their most intrinsic attractive vocation. Just as an architecture of staying develops, an architecture of moving manifests itself, and these architectures blend and integrate with a territory that is itself attractive by nature. The terminological origin of nomadism is a necessary step towards understanding a new architectural dimension linked to concepts such as mobility and dwelling. The research therefore delves into the "diasporic" literature, in which the first configurations linked to temporariness and "wandering" constructions appear. In short, after positioning and classifying the tourist phenomenon as a new form of dwelling, without which no complete territorial planning could be carried out since the phenomenon is by now unavoidable, the research proceeds to identify an area understood as a tool for investigating the relations between the different tourist categories and "typologies". 
The Riviera Romagnola is certainly very famous for its hospitality and for its imposing tourist infrastructure, but at the industrial level it is no less famous for the port of Ravenna, which constitutes a logistic reference point for the exchange of goods and raw materials by sea, besides being, in its entire extension, a case of excellence. The province of Ravenna brings together all the factors needed to satisfy the Total Leisure Experience, that is, experiences of complete fulfilment during the holiday. What emerges from the considerations carried out on the Ravenna territory is that the modern tourist no longer seeks a single-theme holiday, spent only on the beach or devoted exclusively to monuments and culture. The demand is for a pleasure produced by a multiplicity of elements. Consider a tourist district where the offer, besides the beach or cultural itineraries, is also an occasion to do sport or fitness, to relax in serene places, to taste or buy typical foods and, at the same time, to enjoy the same services a person can have at home. The path, aimed at defining a method for designing hospitality, starts from the acquisition of the national and international experiences of the last ten years. This "typological" research phase concluded with a critical evaluation highlighting the strengths and weaknesses of the experiences examined. The conclusion of this exploration produced a first draft of the "conceptual objectives" linked to the elaboration of an architectural model. The research project focuses on the route traced by the Fiumi Uniti in Ravenna. 
This choice makes it possible to consider a parameter showing factors of continuity between coast and city, between seaside tourism and cultural tourism, thus regarded as a potential tool for connecting realities that are often homologous or complementary, in view of the tourism enhancement that the research project counts first among its objectives. The theme of the architecture of hospitality, which here takes shape in the idea of experimenting with the ALBERGO DIFFUSO (scattered hotel), is the one that best highlights the specific form of the local culture while preserving its universal vocation. The design proposal is articulated in a consequential and organic study capable of promoting an original reflection on the theme of the "dwelling" module in places close to the territorial landmarks of specific interest, around which the growing influx of a strongly differentiated public highlights the need for singular nodes suited to satisfying a multiplicity of uses in contexts of great value.

Relevância:

30.00%

Publicador:

Resumo:

Self-organisation is increasingly being regarded as an effective approach to tackling the complexity of modern systems. The self-organisation approach allows the development of systems that exhibit complex dynamics and adapt to environmental perturbations without requiring complete knowledge of the future surrounding conditions. However, the development of self-organising systems (SOS) is driven by different principles from those of traditional software engineering. For instance, engineers typically design systems by combining smaller elements, where the composition rules depend on the reference paradigm but typically produce predictable results. Conversely, SOS display non-linear dynamics, which can hardly be captured by deterministic models, and, although robust with respect to external perturbations, are quite sensitive to changes in inner working parameters. In this thesis, we describe methodological aspects concerning the early design stage of SOS built relying on the multiagent paradigm: in particular, we refer to the A&A metamodel, where MAS are composed of agents and artefacts, i.e. environmental resources. We then describe an architectural pattern extracted from a recurrent solution in designing self-organising systems: this pattern is based on a MAS environment formed by artefacts, modelling non-proactive resources, and environmental agents acting on artefacts so as to enable self-organising mechanisms. In this context, we propose a scientific approach for the early design stage of the engineering of self-organising systems: the process is iterative, and each cycle is articulated in four stages: modelling, simulation, formal verification, and tuning. During the modelling phase we mainly rely on the existence of a self-organising strategy observed in Nature and, hopefully, encoded as a design pattern. 
Simulations of an abstract system model are used to drive design choices until the required quality properties are obtained, thus providing guarantees that the subsequent design steps would lead to a correct implementation. However, system analysis exclusively based on simulation results does not provide sound guarantees for the engineering of complex systems: to this purpose, we envision the application of formal verification techniques, specifically model checking, in order to exactly characterise the system behaviours. During the tuning stage parameters are tweaked in order to meet the target global dynamics and feasibility constraints. In order to evaluate the methodology, we analysed several systems: in this thesis, we only describe three of them, i.e. the most representative ones for each of the three years of PhD course. We analyse each case study using the presented method, and describe the exploited formal tools and techniques.
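The simulation stage described above can be illustrated on a canonical nature-inspired self-organising strategy: firefly-like synchronisation, here sketched as a Kuramoto model of identical coupled oscillators. This is a generic example of checking that a target global dynamics (full synchrony) emerges from local interactions, not one of the thesis's case studies; all parameters are assumptions:

```python
import numpy as np

# Simulation sketch: identical Kuramoto oscillators with mean-field
# coupling. Starting from random phases, synchrony self-organises;
# the order parameter r rising towards 1 is the target global dynamics.
rng = np.random.default_rng(42)

def order_parameter(theta):
    """Kuramoto order parameter r in [0, 1]; r = 1 means full synchrony."""
    return np.abs(np.exp(1j * theta).mean())

def simulate(n=50, K=1.0, dt=0.05, steps=2000):
    theta = rng.uniform(0, 2 * np.pi, n)     # random initial phases
    r0 = order_parameter(theta)
    for _ in range(steps):
        # each agent nudges its phase towards the others (local rule)
        diff = theta[None, :] - theta[:, None]
        theta = theta + dt * (K / n) * np.sin(diff).sum(axis=1)
    return r0, order_parameter(theta)

r_start, r_end = simulate()
print(r_start, r_end)    # r_end approaches 1: synchrony has emerged
```

In the methodology's terms, repeated runs of such a model drive the tuning stage (e.g. choosing K), while properties like "r eventually exceeds a threshold" are candidates for formal verification by stochastic model checking.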

Relevância:

30.00%

Publicador:

Resumo:

The present study aims at analyzing how dark humour as a cinematic genre travels cross-culturally through a specific mode of audiovisual translation, i.e. dubbing. In particular, it takes into consideration the processes involved in dubbing humour from English into Italian as observed in the English- and Italian-language versions of ten British and American dark comedies from the 1940s to the 2000s. In an attempt to identify some of the main mechanisms of the dark humour genre, the humorous content of the films was analyzed in terms of the elements on which specific scenes are based, mainly the non-verbal and verbal components. In the cases in which verbal elements were involved, i.e. the examples of verbally expressed humour, the analysis was concerned with whether they were adapted into Italian and to what effect. Quantification of the different kinds of dark humour revealed that in the sample of dark comedies verbal dark humour had a higher frequency (85.3%) than non-verbal dark humour (14.7%), which partially disconfirmed the first part of the research hypothesis. However, the significance of contextual elements in the conveying of dark humour, both in the form of Nsp VEH (54.31%) and V-V (V+VE) (21.68%), provided support for the hypothesis that, even when expressed verbally, dark humour is more closely linked to context-based rather than purely linguistic humour (4.9%). The second part of the analysis was concerned with an investigation of the strategies adopted for the translation of verbal dark humour elements from the SL (English) into the TL (Italian) through the filter of dubbing. Four translational strategies were identified as far as the rendering of verbal dark humour is concerned: i) complete omission; ii) weakening; iii) close rendering; and iv) increased effect. Close rendering was found to be the most common among these strategies, with 80.9% of dark humour examples being transposed in a way that kept the ST’s function substantially intact. 
Weakening of darkly humorous lines was applied in 12% of cases, whereas increased effect accounted for 4.6% and complete omission for 2.5%. The fact that a close rendering effect was observed for most examples of Nsp VEH (84.9%) and V-AC (V+VE) (91.4%), and that 12 out of 21 examples of V-AC (PL) (a combined 57%) were either omitted or weakened, seemed to confirm, on the one hand, the complexity of the translation process required by cases of V-AC (PL) and V-AC (CS). On the other hand, as suggested in the second part of the research hypothesis, the data might be interpreted as indicating that less effort is involved on the translator/adaptor’s part in the adaptation of V-AC (Nsp VEH) and V-V (V+VE). The issue of possible censorial intervention undergone by examples of verbal dark humour in the sample remains unclear.

Relevância:

30.00%

Publicador:

Resumo:

The present thesis has as its object of study a project called “Education towards Peace and No Violence”, developed in the town of Altinópolis, in the interior of the state of São Paulo, Brazil. Based on the analyses and studies that were carried out, it was possible to identify the difficulties in the implementation of the project, as well as the positive results evidenced by the reduction of reports of violence in the town. Through careful research of the historical facts, the progress of education over the years, particularly in Brazil, is demonstrated. In addition, a panoramic view of the studies and programs for peace developed in North America and Western Europe is presented. Education for peace based on the holistic concept of human development emerged as a new educational paradigm that serves as a crucial instrument in the process of constructing a peaceful, more humane society grounded in social justice.

Relevância:

30.00%

Publicador:

Resumo:

If the historian's job is to understand the past as it was understood by the people who lived it, then perhaps it is not far-fetched to think that it is also necessary to communicate the results of research with the tools proper to an epoch, tools that shape the mentality of those who live in that epoch. Emerging technologies, especially in the area of multimedia such as virtual reality, allow historians to communicate the experience of the past in multiple senses. How, then, does history collaborate with information technologies, with particular regard to the possibility of making virtual historical reconstructions, with related examples and reviews? What most worries historians is whether the reconstruction of a past event, experienced through its recreation in pixels, is a method of knowing history that can be considered valid. That is, is the emotion that navigating a 3D environment can arouse a means capable of transmitting knowledge? Or is our idea of the past and of its study subtly changed the moment it is disseminated through 3D graphics? For some time, however, the discipline has begun to come to terms with this situation, forced above all by the invasiveness of this type of media, by the spectacularisation of the past, and by a partial and unscientific popularisation of the past. In a post-literary world we must begin to recognise that the visual culture in which we are immersed is changing our relationship with the past: this does not make the knowledge built up until today false, but it is necessary to acknowledge that more than one historical truth exists, sometimes written, sometimes visual. The computer has become an omnipresent platform for the representation and diffusion of information, and the methods of interaction and representation are constantly evolving. It is along these two tracks that the offer of information technologies at the service of history moves. 
The purpose of this thesis is precisely to explore, through the use of and experimentation with different software tools and technologies, how the past can be effectively narrated through three-dimensional objects and virtual environments, and how, as characterising elements of communication, these can collaborate, in this particular case, with the historical discipline. The present research reconstructs some lines of the history of the main factories active in Turin during the Second World War. Recalling the close relation that exists between structures and individuals, and in this city in particular between factory and workers' movement, it is inevitable to delve into the events of the Turin workers' movement, which in the period of the Liberation struggle was a political and social subject of the first rank in the city. In the city, understood as a biological entity involved in the war, the factory (or the factories) becomes the conceptual nucleus through which to read the city: the factories were the main targets of the bombings, and it was in the factories that a war of liberation was fought between the working class and the authorities, both of the factory and of the city. The factory becomes the place of the "usurpation of power" of which Weber speaks, the stage on which the various episodes of the war take place: strikes, deportations, occupations... The model of the city represented here is not a simple visualisation but an information system in which the modelled reality is represented by objects that act as a theatre for events with a precise chronological placement, within which it is possible to select static renders (images), pre-computed films (animations) and interactively navigable scenarios, as well as to search bibliographic sources and scholars' comments specifically linked to the event in question. 
The objective of this work is to make the historical disciplines and computer science interact, through several projects, across the different technological opportunities the latter offers. The reconstruction possibilities offered by 3D are thus placed at the service of research, providing an integrated view capable of bringing us closer to the reality of the period under consideration and channelling all the results into a single presentation platform. Dissemination. Project: Multimedia Information Map, Turin 1945. On the practical level, the project provides a navigable interface (Flash technology) representing the map of the city at the time, through which it is possible to gain a view of the places and times in which the Liberation took shape, both conceptually and practically. This interweaving of coordinates in space and time not only improves the understanding of the phenomena but also generates greater interest in the subject through highly effective (and appealing) tools of dissemination, without losing sight of the need to validate the historical theses, positioning itself as a teaching platform. Such a context requires an in-depth study of the historical events in order to reconstruct clearly a map of the city that is precise both topographically and at the level of multimedia navigation. The preparation of the map had to follow current standards, so the software solutions used were those provided by Adobe Illustrator for the topography and by Macromedia Flash for the creation of a navigation interface. The underlying descriptive data is of course consultable, being contained in the media support and fully annotated in the bibliography.
It is the continuous evolution of information technologies and the massive spread of computer use that is bringing about a substantial change in the study and learning of history; academic institutions and economic operators have embraced the demand from users (teachers, students, cultural heritage professionals) for a wider dissemination of historical knowledge through its computerised representation. On the teaching side, reconstructing a historical reality through digital tools allows even non-historians to experience at first hand the problems of research: missing sources, gaps in the chronology, and the assessment of the veracity of facts through evidence. Information technologies allow a complete, unified and comprehensive view of the past, channelling all the information onto a single platform and enabling even non-specialists to grasp immediately what is being discussed. Even the best history book, by its nature, cannot do this, since it divides and organises information differently. In this way, students are given the opportunity to learn through a representation different from those they are used to. The central premise of the project is that students' learning outcomes can improve when a concept or content is communicated through multiple channels of expression, in our case through text, images and a multimedia object. Teaching. The Conceria Fiorio is one of the symbolic places of the Turin Resistance. The project is a virtual-reality reconstruction of the Conceria Fiorio in Turin. The reconstruction enriches historical culture both for those who produce it, through careful research of the sources, and for those who can then make use of it, above all young people, who, attracted by the playful side of the reconstruction, learn more easily.
Building an artefact in 3D gives students the basis for recognising and expressing the proper relationship between the model and the historical object. The stages of work that led to the 3D reconstruction of the Conceria were: in-depth historical research based on the sources, which may be archival documents or archaeological excavations, iconographic and cartographic sources, etc.; the modelling of the buildings on the basis of the historical research, to provide the polygonal geometric structure that allows three-dimensional navigation; and the realisation, through the tools of computer graphics, of the 3D navigation itself. Unreal Technology is the name of the graphics engine used in numerous commercial video games. One of its fundamental features is a tool called the Unreal editor, with which virtual worlds can be built, and which is the one used for this project. UnrealEd (UEd) is the software for creating levels for Unreal and for games based on the Unreal engine; the free version of the editor was used. The final result of the project is a navigable virtual environment depicting an accurate reconstruction of the Conceria Fiorio at the time of the Resistance. The user can visit the building and view specific information about a number of points of interest. Navigation takes place in first person; a process of "staging" the visited environments through period-appropriate furnishings gives the user greater immersion, making the environment more credible and immediately readable. The Unreal Technology architecture made it possible to achieve a good result in a very short time, without any programming being necessary. This engine is therefore particularly suited to the rapid creation of prototypes of reasonable quality, although the presence of a certain number of bugs makes it partly unreliable.
Using a video-game editor for this reconstruction points toward its possible use in teaching: what 3D simulations allow, in this specific case, is to let students experience the work of historical reconstruction, with all the problems the historian must face in recreating the past. For historians, this work is intended as a step toward building a broader expressive repertoire that includes three-dimensional environments. The risk of spending time learning how this technology for generating virtual spaces works makes many of those engaged in teaching sceptical, but the experience of projects developed, above all abroad, shows that they are a good investment. The fact that a software house producing a hugely successful video game includes in its product a set of tools allowing users to create their own worlds to play in is symptomatic of the fact that the computer literacy of average users is growing ever more rapidly, and that the use of an editor such as the Unreal Engine's will in future be within the reach of an ever wider public. This puts us in a position to design more immersive teaching modules, in which the experience of researching and reconstructing the past is interwoven with the more traditional study of the events of a given era. Interactive virtual worlds are often described as the key cultural form of the twenty-first century, as cinema was for the twentieth. The purpose of this work has been to suggest that there are great opportunities for historians in employing 3D objects and environments, and that they must seize them. Consider that aesthetics has an effect on epistemology, or at least on the form that the results of historical research take when they must be disseminated.
A historical analysis carried out superficially or on mistaken premises can nonetheless circulate and gain credit in many circles if it is disseminated through attractive, modern means. This is why it is not worth burying a good piece of work in some library, waiting for someone to discover it; this is why historians must not ignore 3D. Our ability, as scholars and students, to perceive important ideas and trends often depends on the methods we use to represent the data and the evidence. For historians to reap the benefits that 3D brings with it, however, they must develop a research agenda aimed at ensuring that 3D serves their goals as researchers and teachers. A historical reconstruction can be very useful educationally not only for those who visit it but also for those who build it: the research phase required for the reconstruction can only broaden the developer's cultural background. Conclusions. The most important thing has been the opportunity to gain experience in using media of this kind to narrate the past and make it known. Reversing the cognitive paradigm I had learned in my humanities studies, I sought to derive what we might call "universal laws" from the objective data that emerged from these experiments. From an epistemological point of view, computing, with its ability to handle impressive masses of data, gives scholars the possibility of formulating hypotheses and then confirming or refuting them through reconstructions and simulations. My work has moved in this direction, seeking to understand and use current tools that will have an ever greater presence in communication (scientific communication included) and that are the media of choice for certain age groups (adolescents).
Pushing the terms to their extreme, we might say that the challenge visual culture poses today to the traditional methods of doing history is the same one Herodotus and Thucydides posed to the tellers of myths and legends. Before Herodotus there was myth, which was a perfectly adequate means of narrating and giving meaning to the past of a tribe or a city. In a post-literate world our knowledge of the past is subtly changing the moment we see it represented in pixels, or when information emerges not on its own but through interactivity with the medium. Our ability as scholars and students to perceive important ideas and trends often depends on the methods we use to represent the data and the evidence. For historians to obtain the benefits implicit in 3D, however, they must develop a research agenda aimed at ensuring that 3D serves their goals as researchers and teachers. The experience gathered in the preceding pages leads us to think that, in a not too distant future, a tool like the computer will be the sole medium through which knowledge is transmitted, and that, from a teaching point of view, its interactivity engages students as no other modern medium of communication can.


Resumo:

This research argues for an analysis of textual and cultural forms in the American horror film (1968-1998) by defining its so-called postmodern characteristics. The term "postmodern" will not denote a period in the history of cinema, but a series of forms and strategies recognisable in many American films. From a bipolar, re-mediation and cognitive point of view, the postmodern phenomenon has been considered a formal and epistemological re-configuration of the "modern" cultural system. The first section of the work examines theoretical problems concerning the "postmodern phenomenon" by defining its cultural and formal constants in different areas (epistemology, economics, mass media): convergence, fragmentation, manipulation and immersion constitute the former, while "excess" is the morphology of the change, producing the "fluctuation" of the previously consolidated system. The second section classifies the textual and cultural forms of American postmodern film, generally non-horror. The "classic narrative" structure, a coherent and consequent chain of causal cues toward a conclusion, is scattered by the postmodern constant of "fragmentation". New textual models arise, fragmenting the narrative ones into aggregations of data without causal-temporal logic. Considering the processes of "transcoding" [1] and "remediation" [2] between media, and the principle of "convergence" within the phenomenon, the essay aims to define these structures in postmodern film as "database forms" and "navigable space forms". The third section applies this classification to the American horror film (1968-1998).
The formal constant of "excess" in the horror genre works on the paradigm of "vision": if postmodern film shows a crisis of "truth" in vision, in horror movies the excess of vision becomes "hyper-vision", that is, the "multiplication" of visions of death/blood/torture, and "intra-vision", which shows the impossibility of distinguishing the "real" vision from the virtual/imaginary. In this perspective, the textual and cultural forms and strategies of postmodern horror film are predominantly: the "database-accumulation" forms, where the events result from a very simple "remote cause" serving as a pretext (as in Night of the Living Dead); and the "database-catalogue" forms, where the events follow one another around a "central" character or theme. In the latter case, the catalogue syntagms are connected by "consecutive" elements, building stories linked by the actions of a single character (usually the killer), or by non-consecutive episodes around a general theme: examples of the first kind follow the model of The Wizard of Gore; of the second, films such as Mario Bava's I tre volti della paura. The "navigable space" forms are defined as: hyperlink a, where a single universe fluctuates between reality and dream, as in Rosemary's Baby; hyperlink b, where two non-hierarchical universes converge, one real and the other fictional, as in the Nightmare series; hyperlink c, where several worlds are separate but become contiguous in the final sequence, as in Targets; and the last form, navigable-loop, which includes a textual line that suddenly stops and starts again, following the pattern of a "loop" (as in Lost Highway). This essay analyses in detail the organisation of "visual space" in the postmodern horror film by tracing representative patterns. It concludes by examining the "convergence" [3] of the technologies and cognitive structures of cinema and new media.