13 results for fragmentation pattern
in AMS Tesi di Dottorato - Alm@DL - Università di Bologna
Abstract:
This thesis proposes a new document model, according to which any document can be segmented into independent components and transformed into a pattern-based projection that uses only a very small set of objects and composition rules. The point is that such a normalized document expresses the same fundamental information as the original, in a simple, clear and unambiguous way. The central part of my work consists of discussing that model, investigating how a digital document can be segmented, and how a segmented version can be used to implement advanced conversion tools. I present seven patterns which are versatile enough to capture the most relevant document structures, and whose minimality and rigour make that implementation possible. The abstract model is then instantiated into an actual markup language, called IML. IML is a general and extensible language which basically adopts an XHTML syntax and is able to capture, a posteriori, only the content of a digital document. It is compared with other languages and proposals in order to clarify its role and objectives. Finally, I present some systems built upon these ideas. These applications are evaluated in terms of user benefits, workflow improvements and impact on the overall quality of the output. In particular, they cover heterogeneous content management processes: from web editing to collaboration (IsaWiki and WikiFactory), from e-learning (IsaLearning) to professional printing (IsaPress).
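As a rough illustration of the segmentation idea, the sketch below projects a document onto a tiny, hypothetical pattern vocabulary and checks its composition rules; the pattern names are illustrative stand-ins, not the seven patterns actually defined in the thesis.

```python
# Illustrative sketch only: projecting a document onto a tiny pattern
# vocabulary. The pattern names (container, block, inline, atom) are
# hypothetical stand-ins, not the thesis's seven patterns.
from dataclasses import dataclass, field
from typing import List, Union

@dataclass
class Node:
    pattern: str                       # e.g. "container", "block", "inline", "atom"
    children: List[Union["Node", str]] = field(default_factory=list)

# Toy composition rules: what each pattern may contain.
ALLOWED = {
    "container": {"container", "block"},
    "block": {"inline", "atom"},
    "inline": {"atom"},
    "atom": set(),                     # atoms hold only text
}

def validate(node: Node) -> bool:
    """Check that a projected document respects the toy composition rules."""
    for child in node.children:
        if isinstance(child, str):
            continue
        if child.pattern not in ALLOWED[node.pattern] or not validate(child):
            return False
    return True

doc = Node("container", [
    Node("block", [Node("inline", [Node("atom", ["A short paragraph."])])]),
])
print(validate(doc))  # True
```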
Abstract:
One of the problems in the analysis of nucleus-nucleus collisions is to get information on the value of the impact parameter b. This work consists in the application of pattern recognition techniques aimed at associating values of b to groups of events. To this end, a support vector machine (SVM) classifier is adopted to analyze multifragmentation reactions. This method allows backtracing the values of b through a particular multidimensional analysis. The SVM classification consists of two main phases. In the first one, known as the training phase, the classifier learns to discriminate events generated by two different models, Classical Molecular Dynamics (CMD) and Heavy-Ion Phase-Space Exploration (HIPSE), for the reaction 58Ni + 48Ca at 25 AMeV. In the second one, known as the test phase, what has been learned is checked on new events generated by the same models. These results have been compared to the ones obtained through other techniques of backtracing the impact parameter. Our tests show that, following this approach, central and peripheral collisions for the CMD events are always classified better than with the other backtracing techniques. We have finally performed the SVM classification on the experimental data measured by the NUCL-EX collaboration with the CHIMERA apparatus for the same reaction.
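A minimal sketch of such a two-phase SVM classification, assuming each event has already been reduced to a feature vector of global observables; the features, data and class binning below are hypothetical, for illustration only.

```python
# Minimal sketch (not the thesis code): SVM-based backtracing of the impact
# parameter b, assuming each event is reduced to a feature vector of global
# observables (e.g. fragment multiplicity, largest-fragment charge,
# transverse energy -- hypothetical choices for illustration).
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Stand-in for CMD/HIPSE simulated events: X holds one feature vector per
# event, y holds the impact-parameter class (e.g. 0 = central, 1 = peripheral)
# known from the simulation.
X_train = rng.normal(size=(1000, 3))
y_train = (X_train[:, 0] + 0.1 * rng.normal(size=1000) > 0).astype(int)

# Training phase: the classifier learns to separate the classes.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train, y_train)

# Test phase: events the classifier has never seen are assigned a class,
# i.e. a bin in impact parameter.
X_test = rng.normal(size=(200, 3))
b_class = clf.predict(X_test)
print(b_class[:10])
```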
Abstract:
This research argues for an analysis of textual and cultural forms in the American horror film (1968-1998), by defining the so-called postmodern characters. The term "postmodern" will not denote a period in the history of cinema, but a series of forms and strategies recognizable in many American films. From a bipolar re-mediation and cognitive point of view, the postmodern phenomenon has been considered as a formal and epistemological re-configuration of the cultural "modern" system. The first section of the work examines theoretical problems about the "postmodern phenomenon" by defining its cultural and formal constants in different areas (epistemology, economy, mass media): the characters of convergence, fragmentation, manipulation and immersion represent the former, while "excess" is the morphology of the change, realizing the "fluctuation" of the previously consolidated system. The second section classifies the textual and cultural forms of American postmodern film, generally non-horror. The "classic narrative" structure – a coherent and consequent chain of causal cues toward a conclusion – is scattered by the postmodern constant of "fragmentation". New textual models arise, fragmenting the narrative ones into aggregations of data without causal-temporal logic. Considering the processes of "transcoding" and "remediation" between media, and the principle of "convergence" in the phenomenon, the essay aims to define these structures in postmodern film as "database forms" and "navigable space forms". The third section applies this classification to American horror film (1968-1998). The formal constant of "excess" in the horror genre works on the paradigm of "vision": if postmodern film shows a crisis of "truth" in vision, in horror movies the excess of vision becomes "hyper-vision" – that is, a "multiplication" of death/blood/torture visions – and "intra-vision", which shows the impossibility of distinguishing the "real" vision from the virtual/imaginary one. In this perspective, the textual and cultural forms and strategies of postmodern horror film are predominantly: the "database-accumulation" forms, where the events result from a very simple "remote cause" serving as a pretext (as in Night of the Living Dead); and the "database-catalogue" forms, where the events follow one another around a "central" character or theme. In the latter case, the catalogue syntagms are connected by "consecutive" elements, building stories linked by the actions of a single character (usually the killer), or connected by non-consecutive episodes about a general theme: examples of the first kind are built on the model of The Wizard of Gore; the second, on films such as Mario Bava's I tre volti della paura. The "navigable space" forms are defined as: hyperlink a, where one universe fluctuates between reality and dream, as in Rosemary's Baby; hyperlink b, where two non-hierarchical universes converge, one real and the other fictional, as in the Nightmare series; hyperlink c, where several worlds are separate but contiguous in the last sequence, as in Targets; and the last form, navigable-loop, which includes a textual line that suddenly stops and starts again, following the pattern of a "loop" (as in Lost Highway). This essay analyses in detail the organization of "visual space" in the postmodern horror film by tracing representative patterns. It concludes by examining the "convergence" of technologies and cognitive structures of cinema and new media.
Abstract:
Introduction: Apoptotic cell death of cardiomyocytes is involved in several cardiovascular diseases, including ischemia, hypertrophy and heart failure, thus representing a potential therapeutic target. Apoptosis of cardiac cells can be induced experimentally by several stimuli, including hypoxia, serum withdrawal or a combination of both. Several lines of research suggest that neurohormonal mechanisms play a central role in the progression of heart failure. In particular, excessive activation of the sympathetic nervous system or the renin-angiotensin-aldosterone system is known to have deleterious effects on the heart. Recent studies report that norepinephrine (NE), the primary transmitter of the sympathetic nervous system, and aldosterone (ALD), which is actively produced in the failing human heart, are able to induce apoptosis of rat cardiomyocytes. Polyamines are biogenic amines involved in many cellular processes, including apoptosis. Indeed, it appears that these molecules can act as promoting, modulating or protective agents in apoptosis, depending on the apoptotic stimulus and the cellular model. We have studied the involvement of polyamines in the apoptosis of cardiac cells induced in a model of simulated ischemia and following treatment with NE or ALD. Methods: H9c2 cardiomyoblasts were exposed to a condition of simulated ischemia, consisting of hypoxia plus serum deprivation. Cardiomyocyte cultures were prepared from 1-3-day-old neonatal Wistar rat hearts. Polyamine depletion was obtained by culturing the cells in the presence of α-difluoromethylornithine (DFMO). Polyamines were separated and quantified in acidic cellular extracts by HPLC after derivatization with dansyl chloride. Caspase activity was measured by the cleavage of a fluorogenic peptide substrate. Ornithine decarboxylase (ODC) activity was measured by estimating the release of 14C-CO2 from 14C-ornithine. DNA fragmentation was visualized by terminal transferase-mediated dUTP nick end-labeling (TUNEL) and by DNA laddering on agarose gel electrophoresis. Cytochrome c was detected by immunofluorescent staining. Activation of signal transduction pathways was investigated by western blotting. Results: The results indicate that simulated ischemia, NE and ALD cause an early induction of the activity of ODC, the first enzyme in polyamine biosynthesis, followed by a later increase in the activity of caspases, a family of proteases that execute the death program and induce cell death. This effect was prevented in the presence of DFMO, an irreversible inhibitor of ODC, thus suggesting that polyamines are involved in the execution of the death program activated by these stimuli. In H9c2 cells, DFMO inhibits several molecular events related to apoptosis that follow simulated ischemia, such as the release of cytochrome c from mitochondria, down-regulation of Bcl-xL, and DNA fragmentation. The anti-apoptotic protein survivin is down-regulated after ALD or NE treatment, and polyamine depletion obtained by DFMO partially opposes the decrease in survivin. Moreover, a study of key signal transduction pathways governing cell death and survival revealed an involvement of AMP-activated protein kinase (AMPK) and AKT kinase in the modulation by polyamines of the response of cardiomyocytes to NE. In fact, polyamine-depleted cells show an altered pattern of AMPK and AKT activation that may counteract apoptosis and appears to result from a differential effect on the specific phosphatases that dephosphorylate and switch off these signaling proteins. Conclusions: These results indicate that polyamines are involved in the execution of the death program activated in cardiac cells by heart failure-related stimuli, such as ischemia, ALD and NE, and suggest that their apoptosis-facilitating action is mediated by a network of specific phosphatases and kinases.
Abstract:
Customer satisfaction has traditionally been studied and measured regardless of the time elapsed since the purchase. Some studies have recently reopened the debate about the temporal pattern of satisfaction. This research aims to explain why "how you evaluate a service depends on when you evaluate it", on the basis of the theoretical framework proposed by Construal-Level Theory (CLT). Although an empirical investigation is still lacking, the literature does not deny that CLT can also be applied to past events. Moreover, some studies support the idea that satisfaction is a good predictor of future intentions, while others do not. On the basis of CLT, we argue that these inconsistent results are due to the different construal levels of the information pertaining to retrospective and prospective evaluations. Building on the Two-Factor Theory, we explain the persistence of certain attributes' representations over time according to their relationship with overall performance. We present and discuss three experiments and one field study that were conducted a) to test the extensibility of CLT to past events, b) to disentangle memory and construal effects, c) to study the effect of different temporal perspectives on overall satisfaction judgements, and d) to investigate the temporal shift of the determinants of customer satisfaction as a function of temporal distance.
Abstract:
The reactions 32S + 58,64Ni are studied at 14.5 AMeV. From this energy on, fragmentation begins to be a dominant process, although evaporation and fission are still present. After a selection of the collision mechanism, we show that important even-odd effects are present in the isotopic fragment distributions when the excitation energy is small. The staggering effect appears to be a universal feature of fragment production, slightly enhanced when the emission source is neutron-poor. A closer look at the behavior of isotopic chains reveals that odd-even effects cannot be explained by pairing effects in the nuclear mass alone, but depend in a more complex way on the de-excitation chain.
Abstract:
Background/Objectives: Sleep has been shown to enhance creativity, but the reason for this enhancement is not entirely known. There are several different physiological states associated with sleep. In addition to rapid eye movement (REM) and non-rapid eye movement (NREM) sleep, NREM sleep can be broken down into stages (1-4) that are characterized by the degree of EEG slow-wave activity. In addition, during NREM sleep there are transient but cyclic alternating patterns (CAP) of EEG activity, and these CAPs can be divided into three subtypes (A1-A3) according to the speed of the EEG waves. Differences in CAP ratios have previously been linked to cognitive performance. The purpose of this study was to examine the relationship between CAP activity during sleep and creativity. Methods: The participants were 8 healthy young adults (4 women), who underwent 3 consecutive nights of polysomnographic recording and took the Abbreviated Torrance Test for Adults (ATTA) on the 2nd and 3rd mornings after the recordings. Results: There were positive correlations between Stage 1 of NREM sleep and some measures of creativity, such as fluency (R = .797; p = .029) and flexibility (R = .43; p = .002), and between Stage 4 of NREM sleep and originality (R = .779; p = .034) and a global measure of figural creativity (R = .758; p = .040). There was also a negative correlation between REM sleep and originality (R = -.827; p = .042). During NREM sleep the CAP rate, which in young people is primarily of the A1 subtype, also correlated with originality (R = .765; p = .038). Conclusions: NREM sleep is associated with low levels of cortical arousal, and low cortical arousal may enhance the ability of people to access the remote associations that are critical for creative innovations. In addition, A1 CAP activity reflects frontal activity, and the frontal lobes are important for divergent thinking, also a critical aspect of creativity.
Abstract:
Several studies have shown that sleep loss/fragmentation may have a negative impact on cognitive performance, mood and autonomic activity. Specific neurocognitive domains, such as executive function (i.e., prefrontal cortex), seem to be particularly vulnerable to sleep loss. Pearson et al. (2006) evaluated 16 RLS patients compared to controls using cognitive tests, including some particularly sensitive to prefrontal cortical (PFC) functioning and to sleep loss. RLS patients showed significant deficits on two of the three PFC tests. It has recently been reported that RLS is associated with psychiatric manifestations. A high prevalence of depressive symptoms has been found in patients with RLS (Rothdach et al., 2000). RLS could cause depression through its adverse influence on sleep and energy. On the other hand, symptoms of depression such as sleep deprivation, poor nutrition or lack of exercise may predispose an individual to the development of RLS. Moreover, depressed patients may amplify mild RLS, making occasional RLS symptoms appear to meet threshold criteria. The specific treatment of depression could also be implicated, since antidepressant compounds may worsen RLS and PLMD (Picchietti et al., 2005; Damsa et al., 2004). Interestingly, treatments used to relieve RLS symptoms (dopamine agonists) seem to have an antidepressant effect in depressed RLS patients (Saletu et al., 2002, 2003). During normal sleep there is a well-regulated pattern of autonomic function, modulated by changes in sleep stages. It has been reported that chronic sleep deprivation is associated with cardiovascular events. In patients with sleep fragmentation, an increased number of arousals and an increased cyclic alternating pattern rate are associated with an increase in sympathetic activity. It has been demonstrated that PLMS occurrence is associated with a shift toward increased sympathetic activity without significant changes in cardiac parasympathetic activity (Sforza et al., 2005). An increased association of RLS with hypertension and heart disease has been documented in several studies (Ulfberg et al., 2001; Ohayon et al., 2002).
Abstract:
Carbon fluxes and allocation patterns, and their relationship with the main environmental and physiological parameters, were studied in an apple orchard for one year (2010). I combined three widely used methods: eddy covariance, soil respiration and biometric measurements, and I applied a measurement protocol allowing a cross-check between C fluxes estimated with the different methods. I attributed NPP components to standing biomass increment, detritus cycle and lateral export. The influence of environmental and physiological parameters on NEE, GPP and Reco was analyzed with a multiple regression model approach. I found that both NEP and GPP of the apple orchard were of similar magnitude to those of forests growing in similar climate conditions, while large differences occurred in the allocation pattern and in the fate of the produced biomass. Apple production accounted for 49% of annual NPP, organic material contributing to the detritus cycle (leaves, fine root litter, pruned wood and early fruit drop) accounted for 46%, and only 5% went to standing biomass increment. The carbon use efficiency (CUE), with an annual average of 0.68 ± 0.10, was higher than the previously suggested constant values of 0.47-0.50. Light and leaf area index had the strongest influence on both NEE and GPP. On a diurnal basis, NEE and GPP reached their peak at approximately noon, while they appeared to be limited by high values of VPD and air temperature in the afternoon. The proposed models can be used to explain and simulate the current relations between carbon fluxes and environmental parameters at daily and yearly time scales. On average, the annual NEP balanced the carbon exported annually with the harvested apples. These data support the hypothesis of a minimal or null impact of the apple orchard ecosystem on net C emissions to the atmosphere.
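As a back-of-the-envelope illustration of how the reported quantities relate, the sketch below computes CUE and the NPP partitioning from made-up flux values; only the percentage shares and the reported CUE of about 0.68 come from the abstract, the GPP and respiration figures are hypothetical.

```python
# Illustrative sketch only: relations between the reported carbon-budget
# quantities, using made-up flux values (not data from the thesis).
GPP = 1500.0   # gross primary production, hypothetical, g C m-2 yr-1
Ra  = 480.0    # autotrophic respiration, hypothetical, g C m-2 yr-1

NPP = GPP - Ra        # net primary production
CUE = NPP / GPP       # carbon use efficiency (thesis reports ~0.68 annually)

# Partitioning of NPP among the three fates described in the abstract,
# using the reported shares.
fruit_export     = 0.49 * NPP   # harvested apples (lateral export)
detritus_cycle   = 0.46 * NPP   # leaves, fine-root litter, pruned wood, fruit drop
standing_biomass = 0.05 * NPP   # standing biomass increment

print(f"NPP = {NPP:.0f} g C m-2 yr-1, CUE = {CUE:.2f}")
```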
Abstract:
This research focuses on the Piacentino, that is, the territory that looked to the city of Placentia in the early Middle Ages, more precisely between the seventh and ninth centuries. The analysis concentrates on settlement, relating settlement forms to rural communities and to the structures of power. The Piacentino is a particular context, because of its proximity to the capital Pavia, the presence of very important patrimonial institutions such as the monastery of Bobbio, and the extent of Apennine valleys, which protected its isolation and prevented, until well into the ninth century, the establishment of a strong aristocracy. What seems to emerge from the analysis of the (digitized) documentation of the eighth and, above all, the ninth century is that the civitas of Piacenza, seat of the count and centre of the diocese, was not in fact the point of reference for the inhabitants of the entire rural territory. From the point of view of landholding, many settlements were autonomous. This is true above all for sites in some hill areas and in the Apennines, whereas the picture is more nuanced for the plain and for the area of the prata vel campanea Placentina, located immediately outside the civitas. The study of medium and large landed property made it possible to document the fragmentation of holdings, that is, their dispersion across the settlement system of the county, their location and the intricate web of ownership, which in some areas seems to have allowed the existence of rather strong local communities. The careful examination of the documentation, together with solid reference to the relevant bibliography, has led to original and significant results.
Abstract:
In this work, we have considered the theme of landscape in the poetry of Andrea Zanzotto, Philippe Jaccottet and Seamus Heaney within the perspective of a fragmentation of the aesthetics of nature. To that end, the most advanced theories of aesthetics applied to nature, such as environmental aesthetics and the Aesthetik der Natur (also known as ökologische Aesthetik), have been taken into account. The philosophical perspective of Paolo D'Angelo, insights from geography (in particular from the works of Franco Farinelli) and from ecology (considering the contributions of Gilles Clément to this discipline) have also been useful. We have argued that the poetic experiences of Zanzotto, Jaccottet and Heaney follow a similar path, each starting from the fusion between the poetic subject and the landscape and moving toward a two-way relationship between them. In this interpretation, the concept of landscape has been considered, following Michel Collot's theory of the pensée-paysage, as a phenomenon. The poetic texts have been analysed through linguistic, stylistic and rhetorical approaches, consistent with the idea that every text must be studied within its context, as every poetic experience is constituted of three elements: the poetic subject, his language and his world, the last of which is shaped by and shapes the subject's position and the perspectives related to it, that is, his discourse to the world and in this world.
Abstract:
This research was designed to answer the question of which direction the restructuring of financial regulators should take: consolidation or fragmentation. It began by examining the need for financial regulation and its related costs. It then described the types of regulatory structures that exist in the world, surveying the regulatory structures of 15 jurisdictions, comparing them and discussing their strengths and weaknesses. The research analyzed the possible regulatory structures using three methodological tools: game theory, institutional design and network effects. The incentives for regulatory action were examined in Chapter Four using game-theoretic concepts. This chapter predicted how two regulators with overlapping supervisory mandates would behave in two different states of the world (one in which they stand to benefit from regulating and one in which they stand to lose). The insights derived from the games described in this chapter were then used to analyze the different supervisory models that exist in the world. The problem of information flow was discussed in Chapter Five using tools from institutional design. The idea is based on the need for the right kind of information to reach the hands of the decision maker in the shortest time possible in order to predict, mitigate or stop a financial crisis. Network effects and congestion in the context of financial regulation were discussed in Chapter Six, which applied the general literature on network effects in an attempt to determine whether consolidating financial regulatory standards at the global level might also yield other positive network effects. Returning to the main research question, this research concluded that, in general, the fragmented model is preferable to the consolidated model in most cases, as it allows for greater diversity and information flow. However, in cases in which close cooperation between two authorities is essential, the consolidated model should be used.
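In the spirit of the two-regulator analysis described for Chapter Four, the toy game below shows how pure-strategy equilibria can differ between a "benefit" and a "loss" state of the world; the payoff numbers are entirely hypothetical and are not taken from the thesis.

```python
# Illustrative sketch only: a toy normal-form game between two regulators
# with overlapping mandates. Payoffs are hypothetical, for illustration.
from itertools import product

ACT, ABSTAIN = 0, 1  # each regulator either regulates or abstains

# payoffs[state][(choice_A, choice_B)] = (payoff_A, payoff_B)
payoffs = {
    "benefit": {  # regulating brings credit; duplication is merely costly
        (ACT, ACT): (2, 2), (ACT, ABSTAIN): (4, 0),
        (ABSTAIN, ACT): (0, 4), (ABSTAIN, ABSTAIN): (1, 1),
    },
    "loss": {     # regulating attracts blame; inaction risks a crisis
        (ACT, ACT): (-1, -1), (ACT, ABSTAIN): (-3, 1),
        (ABSTAIN, ACT): (1, -3), (ABSTAIN, ABSTAIN): (-2, -2),
    },
}

def nash_equilibria(game):
    """Return pure-strategy Nash equilibria by checking unilateral deviations."""
    eqs = []
    for a, b in product((ACT, ABSTAIN), repeat=2):
        ua, ub = game[(a, b)]
        best_a = all(ua >= game[(a2, b)][0] for a2 in (ACT, ABSTAIN))
        best_b = all(ub >= game[(a, b2)][1] for b2 in (ACT, ABSTAIN))
        if best_a and best_b:
            eqs.append((a, b))
    return eqs

for state, game in payoffs.items():
    print(state, nash_equilibria(game))
```

With these made-up payoffs, both regulators act in the "benefit" state, while in the "loss" state the only equilibrium is mutual abstention, illustrating the kind of coordination failure such an analysis can expose.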