886 results for Recontextualised found object
Abstract:
We investigated the effects of texture gradient and of the position of the test stimulus relative to the horizon on the perception of relative size. Using the staircase method, 50 participants adjusted the size of a bar presented above, below or on the horizon so that it was perceived as the same size as a bar presented in the lower visual field. Stimuli were presented for 100 ms on five background conditions. The perspective gradient contributed more to the overestimation of relative size than the compression gradient did. The sizes of objects that intercepted the horizon line were overestimated. The visual system was very effective at extracting information from perspective depth cues, doing so even during very brief exposures.
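As a hedged illustration of the adjustment procedure, the following is a minimal sketch of a simple 1-up/1-down staircase in Python; the abstract names only "the staircase method", so the step size, starting size, decision rule and stopping criterion below are assumptions, not details of the study.

```python
# Minimal sketch of a 1-up/1-down staircase for a size-matching task.
# Assumptions (not stated in the abstract): step size, starting size,
# and the number of reversals used as the stopping rule.

def run_staircase(perceived_larger, start_size=100.0, step=4.0, max_reversals=8):
    """perceived_larger(test_size) -> True if the test bar looks larger than
    the reference bar; in the real experiment this is the participant's response."""
    size = start_size
    last_direction = None
    reversals = []
    while len(reversals) < max_reversals:
        direction = -1 if perceived_larger(size) else +1  # shrink if it looks larger
        if last_direction is not None and direction != last_direction:
            reversals.append(size)  # a reversal: the staircase changed direction
        size += direction * step
        last_direction = direction
    # The point of subjective equality is estimated from the reversal sizes.
    return sum(reversals) / len(reversals)

# Example with a simulated observer whose true match is 90 units:
pse = run_staircase(lambda s: s > 90)
print(pse)  # converges near 90
```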
Abstract:
Background Catching an object is a complex movement that involves not only programming but also effective motor coordination. Such behavior is related to the activation and recruitment of cortical regions that participate in the sensorimotor integration process. This study aimed to elucidate the cortical mechanisms involved in anticipatory actions when performing a task of catching an object in free fall. Methods Quantitative electroencephalography (qEEG) was recorded using a 20-channel EEG system in 20 healthy right-handed participants while they performed the ball-catching task. We used EEG coherence analysis to investigate subdivisions of the alpha (8-12 Hz) and beta (12-30 Hz) bands, which are related to cognitive processing and sensorimotor integration. Results We found main effects for the factor block; for alpha-1, coherence decreased from the first to the sixth block, while the opposite occurred for alpha-2 and beta-2, with coherence increasing across blocks. Conclusion We conclude that to perform our task successfully, which involved anticipatory processes (i.e. feedback mechanisms), subjects exhibited a strong involvement of sensorimotor and associative areas, possibly due to the organization of information needed to process visuospatial parameters and subsequently catch the falling object.
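As a hedged sketch of the kind of quantity reported above, magnitude-squared coherence between two EEG channels can be computed and band-averaged as follows; the sampling rate, window length, synthetic signals and exact sub-band edges are assumptions for illustration, not parameters of the study.

```python
# Sketch: magnitude-squared coherence between two EEG channels, averaged
# within frequency bands. Sampling rate, window length and band edges are
# assumed; the signals are synthetic stand-ins for real recordings.
import numpy as np
from scipy.signal import coherence

fs = 250  # Hz, assumed sampling rate
rng = np.random.default_rng(0)
chan_a = rng.standard_normal(fs * 60)                    # stand-in channel
chan_b = 0.5 * chan_a + rng.standard_normal(fs * 60)     # correlated stand-in

f, cxy = coherence(chan_a, chan_b, fs=fs, nperseg=fs * 2)

def band_mean(freqs, coh, lo, hi):
    mask = (freqs >= lo) & (freqs < hi)
    return coh[mask].mean()

print("alpha-1 (8-10 Hz):", band_mean(f, cxy, 8, 10))
print("alpha-2 (10-12 Hz):", band_mean(f, cxy, 10, 12))
print("beta-2 (20-30 Hz):", band_mean(f, cxy, 20, 30))
```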
Abstract:
Thesis in English. Blank pages removed from the PDF.
Abstract:
[EN] In this paper, we present a vascular tree model made of synthetic materials which allows us to obtain images for a 3D reconstruction. We have used PVC tubes of several diameters and lengths, which let us evaluate the accuracy of our 3D reconstruction. In order to calibrate the camera we have used a corner detector. We have also used optical flow techniques to track the points through the image sequence, forward and backward. We describe two general techniques to extract a sequence of corresponding points from multiple views of an object. The resulting sequence of points is later used to reconstruct a set of 3D points representing the object surfaces in the scene. We have performed the 3D reconstruction by randomly choosing a pair of images and have calculated the projection error. After several repetitions, we have found the best 3D location for each point.
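The following is a hedged sketch (not the paper's own code) of the general pipeline described above, using OpenCV: corner detection, forward/backward Lucas-Kanade tracking, triangulation of a view pair, and the reprojection error. The projection matrices P1 and P2, the images, and all detector/threshold parameters are assumed inputs.

```python
# Sketch: track corners between two calibrated views with LK optical flow
# (forward and backward), triangulate them, and measure reprojection error.
# P1, P2 (3x4 projection matrices) and the grayscale images are assumed inputs;
# the detector parameters and the 1-pixel backward-check threshold are assumptions.
import numpy as np
import cv2

def reconstruct_pair(img1, img2, P1, P2):
    # Corner detection in the first view.
    pts1 = cv2.goodFeaturesToTrack(img1, maxCorners=200,
                                   qualityLevel=0.01, minDistance=7)
    # Forward LK tracking into the second view.
    pts2, status, _ = cv2.calcOpticalFlowPyrLK(img1, img2, pts1, None)
    # Backward check: track back and keep only consistent points.
    back, _, _ = cv2.calcOpticalFlowPyrLK(img2, img1, pts2, None)
    good = (status.ravel() == 1) & (
        np.linalg.norm((pts1 - back).reshape(-1, 2), axis=1) < 1.0)
    p1 = pts1.reshape(-1, 2)[good].T  # 2xN
    p2 = pts2.reshape(-1, 2)[good].T
    # Triangulate and convert from homogeneous coordinates.
    X = cv2.triangulatePoints(P1, P2, p1, p2)
    X = (X[:3] / X[3]).T              # Nx3 points
    # Mean reprojection error in the first view.
    proj = P1 @ np.vstack([X.T, np.ones(len(X))])
    proj = proj[:2] / proj[2]
    err = np.linalg.norm(proj - p1, axis=0).mean()
    return X, err
```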
Abstract:
Quasars and AGN play an important role in many aspects of modern cosmology. Of particular interest is the interplay between AGN activity and the formation and evolution of galaxies and structures. Studies of nearby galaxies have revealed that most (and possibly all) galaxy nuclei contain a super-massive black hole (SMBH) and that between a third and a half of them show some evidence of activity (Kormendy and Richstone, 1995). The discovery of a tight relation between black hole mass and the velocity dispersion of the host galaxy suggests that the growth of SMBHs and the evolution of their host galaxies are linked. In this context, studying the evolution of AGN through the luminosity function (LF) is fundamental to constrain theories of galaxy and SMBH formation and evolution. Recently, many theories have been developed to describe the physical processes possibly responsible for a common formation scenario for galaxies and their central black holes (Volonteri et al., 2003; Springel et al., 2005a; Vittorini et al., 2005; Hopkins et al., 2006a), and an increasing number of observations in different bands are focused on collecting larger and larger quasar samples. Many issues, however, are not yet fully understood. In the context of the VVDS (VIMOS-VLT Deep Survey), we collected and studied an unbiased sample of spectroscopically selected faint type-1 AGN with a unique and straightforward selection function. Indeed, the VVDS is a large, purely magnitude-limited spectroscopic survey of faint objects, free of any morphological and/or color preselection. We studied the statistical properties of this sample and its evolution up to redshift z ~ 4. Because of the contamination of the AGN light by their host galaxies at the faint magnitudes explored by our sample, we observed that a significant fraction of the AGN in our sample would be missed by the UV-excess and morphological criteria usually adopted for the pre-selection of optical QSO candidates. If not properly taken into account, this failure in selecting particular sub-classes of AGN could, in principle, affect some of the conclusions drawn from samples of AGN based on these selection criteria. The absence of any pre-selection in the VVDS gives us a very complete sample of AGN, including objects with unusual colors and continuum shapes. The VVDS AGN sample in fact shows redder colors than expected by comparing it, for example, with the color track derived from the SDSS composite spectrum. In particular, the faintest objects have on average redder colors than the brightest ones. This can be attributed both to a large fraction of dust-reddened objects and to a significant contamination from the host galaxy. We have tested these possibilities by examining the global spectral energy distribution of each object using, in addition to the U, B, V, R and I-band magnitudes, also the UV GALEX and IR Spitzer bands, and fitting it with a combination of AGN and galaxy emission, allowing also for the possibility of extinction of the AGN flux. We found that for 44% of our objects the contamination from the host galaxy is not negligible, and this fraction decreases to 21% if we restrict the analysis to a bright subsample (M_1450 < -22.15). Our estimated integral surface density at I_AB < 24.0 is 500 AGN per square degree, which represents the highest surface density of a spectroscopically confirmed sample of optically selected AGN. We derived the luminosity function in the B band for 1.0 < z < 3.6 using the 1/Vmax estimator.
Our data, more than one magnitude fainter than previous optical surveys, allow us to constrain the faint part of the luminosity function up to high redshift. A comparison of our data with the 2dF sample at low redshift (1 < z < 2.1) shows that the VVDS data cannot be well fitted by the pure luminosity evolution (PLE) models derived from previous optically selected samples. Qualitatively, this appears to be due to the fact that our data suggest the presence of an excess of faint objects at low redshift (1.0 < z < 1.5) with respect to these models. By combining our faint VVDS sample with the large sample of bright AGN extracted from the SDSS DR3 (Richards et al., 2006b) and testing a number of different evolutionary models, we find that the model which best represents the combined luminosity functions, over a wide range of redshift and luminosity, is a luminosity-dependent density evolution (LDDE) model, similar to those derived from the major X-ray surveys. Such a parameterization allows the redshift of the AGN density peak to change as a function of luminosity, thus fitting the excess of faint AGN that we find at 1.0 < z < 1.5. On the basis of this model we find, for the first time from the analysis of optically selected samples, that the peak of the AGN space density shifts significantly towards lower redshift for lower luminosity objects. The position of this peak moves from z ~ 2.0 for M_B < -26.0 to z ~ 0.65 for -22 < M_B < -20. This result, already found in a number of X-ray selected samples of AGN, is consistent with a scenario of "AGN cosmic downsizing", in which the density of more luminous AGN, possibly associated with more massive black holes, peaks earlier in the history of the Universe (i.e. at higher redshift) than that of low-luminosity ones, which reaches its maximum later (i.e. at lower redshift). This behavior has long been claimed to be present in elliptical galaxies and is not easy to reproduce in the hierarchical cosmogonic scenario, in which more massive Dark Matter Halos (DMH) form on average later, through the merging of less massive halos.
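For reference, the binned 1/Vmax estimator named above takes its standard form (recalled here for the reader, not a new result of the survey):

\[
\Phi(M)\,\Delta M \;=\; \sum_{i=1}^{N(M)} \frac{1}{V_{\max,i}},
\qquad
\sigma_{\Phi} \;=\; \left[\sum_{i=1}^{N(M)} \frac{1}{V_{\max,i}^{2}}\right]^{1/2},
\]

where V_max,i is the maximum comoving volume within which source i, given its luminosity, would still satisfy the survey magnitude limit (here I_AB < 24.0), and the sum runs over the N(M) AGN in the magnitude bin of width ΔM.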
Abstract:
This Thesis is devoted to the study of the optical companions of Millisecond Pulsars in Galactic Globular Clusters (GCs), as part of a large project started at the Department of Astronomy of the Bologna University, in collaboration with other institutions (Astronomical Observatories of Cagliari and Bologna, University of Virginia), specifically dedicated to the study of environmental effects on passive stellar evolution in Galactic GCs. Globular Clusters are very efficient "kilns" for generating exotic objects, such as Millisecond Pulsars (MSPs), low-mass X-ray binaries (LMXBs) or Blue Straggler Stars (BSS). In particular, MSPs are formed in binary systems containing a Neutron Star which is spun up through mass accretion from the evolving companion (e.g. Bhattacharya & van den Heuvel 1991). The final stage of this recycling process is either the core of a peeled star (generally a helium white dwarf) or a very light, almost exhausted star, orbiting a very fast rotating Neutron Star (an MSP). Despite the large difference in total mass between the disk of the Galaxy and the Galactic GC system (up to a factor of 10^3), the percentage of fast rotating pulsars in binary systems found in the latter is much higher. MSPs in GCs show spin periods in the range 1.3-30 ms, slow-down rates Ṗ ~ 10^-19 s/s and, with B ~ 10^8 gauss, lower magnetic fields than "normal" radio pulsars. The high probability of disruption of a binary system after a supernova explosion explains why we expect only a low percentage of recycled millisecond pulsars with respect to the whole pulsar population. In fact, only about 10% of the ~1800 known radio pulsars are MSPs. It is not surprising that MSPs are overabundant in GCs with respect to the Galactic field, since in the Galactic Disk MSPs can only form through the evolution of primordial binaries, and only if the binary survives the supernova explosion which leads to the neutron star formation. On the other hand, the extremely high stellar density in the core of GCs, relative to most of the rest of the Galaxy, favors the formation of several different binary systems suitable for the recycling of NSs (Davies et al. 1998). In this thesis we present the properties of two millisecond pulsar companions discovered in two globular clusters: the helium white dwarf orbiting the MSP PSR J1911-5958A in NGC 6752, and the second case of a tidally deformed star orbiting an eclipsing millisecond pulsar, PSR J1701-3006B in NGC 6266.
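For context (not stated in the abstract), field strengths of this order are what the standard magnetic-dipole spin-down estimate gives for such periods and period derivatives, assuming canonical neutron-star moment of inertia and radius:

\[
B \;\simeq\; 3.2 \times 10^{19} \sqrt{P\,\dot{P}}\ \mathrm{G}
\;\;\Longrightarrow\;\;
B \;\sim\; 5 \times 10^{8}\ \mathrm{G}
\quad \text{for } P \simeq 3\ \mathrm{ms},\ \dot{P} \simeq 10^{-19}\ \mathrm{s\,s^{-1}}.
\]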
Abstract:
Advances in stem cell biology have challenged the notion that infarcted myocardium is irreparable. The pluripotent ability of stem cells to differentiate into specialized cell lines began to garner intense interest within cardiology when it was shown in animal models that intramyocardial injection of bone marrow stem cells (MSCs), or the mobilization of bone marrow stem cells with spontaneous homing to myocardium, could improve cardiac function and survival after induced myocardial infarction (MI) [1, 2]. Furthermore, the existence of stem cells in the myocardium has been identified in the animal heart [3, 4], and intense research is under way in an attempt to clarify their potential clinical application for patients with myocardial infarction. To date, in order to identify the best one, different kinds of stem cells have been studied; these have been derived from embryonic or adult tissues (i.e. bone marrow, heart, peripheral blood, etc.). Currently, three different biologic therapies for cardiovascular diseases are under investigation: cell therapy, gene therapy and the more recent "tissue-engineering" therapy. During my Ph.D. course, I first focused my study on the isolation and characterization of Cardiac Stem Cells (CSCs) in wild-type and transgenic mice; for this purpose I attended, for more than one year, the Cardiovascular Research Institute of the New York Medical College, in Valhalla (NY, USA) under the direction of Doctor Piero Anversa. During this period I learnt different immunohistochemical and biomolecular techniques useful for investigating the regenerative potential of stem cells. Then, during the next two years, I studied the new approach of cardiac regenerative medicine based on tissue-engineering, in order to investigate a new strategy to regenerate the infarcted myocardium. Tissue-engineering is a promising approach that makes possible the creation of new functional tissue to replace lost or failing tissue. This new discipline combines isolated functioning cells and biodegradable 3-dimensional (3D) polymeric scaffolds. The scaffold temporarily provides the biomechanical support for the cells until they produce their own extracellular matrix. Because tissue-engineering constructs contain living cells, they may have the potential for growth and cellular self-repair and remodeling. In the present study, I examined whether the tissue-engineering strategy with hyaluronan-based scaffolds would result in the formation of alternative cardiac tissue that could replace the scar and improve cardiac function after MI in syngeneic heterotopic rat hearts. Rat hearts were explanted, subjected to left descending coronary artery occlusion, and then grafted into the abdomen (aorta-aorta anastomosis) of a receiving syngeneic rat. After 2 weeks, a pouch of 3 mm² was made in the thickness of the ventricular wall at the level of the post-infarction scar. The hyaluronic scaffold, previously engineered for 3 weeks with rat MSCs, was introduced into the pouch and the myocardial edges were sutured with a few stitches. Two weeks later we evaluated cardiac function by M-mode echocardiography and the myocardial morphology by microscope analysis. We chose bone marrow-derived mesenchymal stem cells (MSCs) because they have shown great signaling and regenerative properties when delivered to heart tissue following a myocardial infarction (MI).
However, while the object of cell transplantation is to improve ventricular function, cardiac cell transplantation has had limited success because of poor graft viability and low cell retention, which is why we decided to combine MSCs with a biopolymeric scaffold. At the end of the experiments we observed that the hyaluronan fibres had not been substantially degraded 2 weeks after heart transplantation. Most MSCs had migrated to the surrounding infarcted area, where they were found especially close to small-sized vessels. Scar tissue was moderate in the engrafted region and the thickness of the corresponding ventricular wall was comparable to that of the non-infarcted remote area. Also, the left ventricular shortening fraction, evaluated by M-mode echocardiography, was found to be slightly increased compared to that measured just before construct transplantation. Therefore, this study suggests that post-infarction myocardial remodelling can be favourably affected by the grafting of MSCs delivered through a hyaluronan-based scaffold.
Abstract:
[EN] The human face provides useful information during interaction; therefore, any system integrating Vision-Based Human Computer Interaction requires fast and reliable face and facial feature detection. Different approaches have focused on this ability, but only open-source implementations have been extensively used by researchers. A good example is the Viola–Jones object detection framework, which has been frequently used, particularly in the context of facial processing.
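As a hedged illustration (not the paper's own code), the Viola–Jones framework is available through OpenCV's Haar cascade implementation and can be used roughly as follows; the cascade file is the frontal-face model shipped with OpenCV, "image.jpg" is a placeholder path, and the detection parameters are typical defaults rather than values from the paper.

```python
# Sketch: face detection with OpenCV's Viola-Jones (Haar cascade) detector.
# "image.jpg" is a placeholder; scaleFactor/minNeighbors/minSize are typical defaults.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

img = cv2.imread("image.jpg")
gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)

faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5,
                                 minSize=(30, 30))
for (x, y, w, h) in faces:
    cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)  # draw detections

cv2.imwrite("faces.jpg", img)
```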
Abstract:
We observed 82 healthy subjects of both sexes, aged between 19 and 77 years. All subjects performed two different tests: the first, a stress test (CPX), was used as a reference because it is scientifically acknowledged; during the entire test, heart rate and gas exchange were recorded continuously. The second, the actual object of this study, was a submaximal test (TOP); only heart rate was recorded continuously. The main purpose was to determine an index of physical fitness from the TOP results. The CPX test allowed us to identify the anaerobic threshold. We used an incremental protocol of 10/20 Watt/min, differentiated by age. For our TOP test we used an RHC400 UPRIGHT BIKE, by Air Machine. Each subject's heart rate was monitored. After 2 minutes of rest there was a first step: 3 minutes of pedalling at a constant rate of 60 RPM (40 watts for older subjects and 60 watts for younger ones). Then the subject was allowed to rest for a recovery phase of 5 minutes. The third and last step consisted of 3 minutes of pedalling, again at 60 RPM but now set to 60 watts for older subjects and 80 watts for younger subjects, followed by another five minutes of recovery. A good correlation was found between TOP and CPX results, especially between punctual heart rate reserve (HRR') and anaerobic threshold parameters such as watts, VO2 and VCO2. HRR' was obtained by subtracting the maximal heart rate during the TOP from the maximal theoretical heart rate (206.9 - 0.67*age). Data were analyzed through cluster analysis in order to obtain 3 homogeneous groups. The first group contains the least fit subjects (inactive, women, elderly). The other groups contain the "average fit" and the fittest subjects (active, men, younger). Concordance between the tests was 83.23%. Afterwards, a linear combination of the most relevant variables gave us a formula to classify people into the correct group. The most relevant result is that this submaximal test is able to discriminate between subjects with different physical conditions and to provide information (an index) about physical fitness through HRR'. Compared to a traditional incremental stress test, the very low load of the TOP, its short duration and its extended resting periods make this new method suitable for very different people. To better define the TOP index, it is necessary to enlarge our subject sample, especially by diversifying the age range.
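A minimal sketch of the HRR' computation described above; the theoretical maximal heart rate uses the formula stated in the abstract (206.9 - 0.67*age), while the example age and measured peak heart rate are invented for illustration.

```python
# Sketch: punctual heart rate reserve (HRR') as defined in the abstract.
# Theoretical maximal HR = 206.9 - 0.67*age (from the abstract);
# the example input values below are invented.

def hrr_prime(age_years, max_hr_during_top):
    """HRR' = theoretical maximal HR minus the maximal HR reached during the TOP test."""
    theoretical_max = 206.9 - 0.67 * age_years
    return theoretical_max - max_hr_during_top

# Example: a 45-year-old whose heart rate peaked at 130 bpm during the TOP test
# (theoretical maximum is about 176.8 bpm, so HRR' is roughly 46.8 bpm).
print(hrr_prime(45, 130))
```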
Abstract:
The main problem with cone beam computed tomography (CT) systems for industrial applications employing 450 kV X-ray tubes is the high amount of scattered radiation added to the primary radiation (signal). This stray radiation leads to a significant degradation of image quality. A better understanding of the scattering, and methods to reduce its effects, are therefore necessary to improve image quality. Several studies have been carried out in the medical field at lower energies, whereas studies in industrial CT, especially for energies up to 450 kV, are lacking. Moreover, the studies reported in the literature do not consider the scattered radiation generated by the CT system structure and the walls of the X-ray room (environmental scatter). In order to investigate the scattering in CT projections, a GEANT4-based Monte Carlo (MC) model was developed. The model, which has been validated against experimental data, has enabled the calculation of the scattering including the environmental scatter, the optimization of an anti-scatter grid suitable for the CT system, and the optimization of the hardware components of the CT system. The investigation of multiple scattering in the CT projections showed that its contribution is 2.3 times that of the primary radiation for certain objects. The results on the environmental scatter showed that it is the major component of the scattering for aluminum box objects with a front size of 70 x 70 mm², and that it strongly depends on the thickness of the object and therefore on the projection. For that reason, its correction is one of the key factors for achieving high quality images. The anti-scatter grid optimized by means of the developed MC model was found to reduce the scatter-to-primary ratio in the reconstructed images by 20%. The object and environmental scatter calculated by means of the simulation were used to improve the scatter correction algorithm, which Empa was able to patent. The results showed that the cupping effect in the corrected image is strongly reduced. The developed CT simulation is a powerful tool to optimize the design of the CT system and to evaluate the contribution of the scattered radiation to the image. It has also offered a basis for a new scatter correction approach with which it has been possible to achieve images with the same spatial resolution as state-of-the-art, well-collimated fan-beam CT, with a gain in reconstruction time of a factor of 10. This result has a high economic impact in non-destructive testing and evaluation, and in reverse engineering.
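As a hedged illustration of the quantities discussed above (and explicitly not the patented correction algorithm), the scatter-to-primary ratio and a first-order subtraction of a Monte Carlo scatter estimate from a measured projection could look like the sketch below; the arrays are invented placeholders.

```python
# Sketch: scatter-to-primary ratio (SPR) and a naive scatter subtraction on a
# detector projection. This is only an illustration of the quantities discussed,
# not the patented Empa algorithm; the toy arrays are invented.
import numpy as np

def scatter_to_primary_ratio(scatter, primary):
    """SPR = S / P, evaluated pixel-wise on a projection."""
    return scatter / primary

def correct_projection(measured, estimated_scatter):
    """Measured signal = primary + scatter, so a first-order correction simply
    subtracts the (e.g. Monte Carlo) scatter estimate, clipped at zero."""
    return np.clip(measured - estimated_scatter, a_min=0.0, a_max=None)

# Toy 2x2 "projection": primary signal plus a smooth scatter background.
primary = np.array([[100.0, 80.0], [60.0, 40.0]])
scatter = np.array([[30.0, 30.0], [28.0, 28.0]])
measured = primary + scatter

print(scatter_to_primary_ratio(scatter, primary))
print(correct_projection(measured, scatter))  # recovers the primary signal
```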
Abstract:
This study aims at analysing Brian O'Nolan's literary production in the light of a reconsideration of the role played by his two most famous pseudonyms, Flann O'Brien and Myles na Gopaleen, behind which he was active both as a novelist and as a journalist. We tried to establish a new kind of relationship between them and their empirical author, following recent cultural and scientific surveys in the fields of Humour Studies, Psychology and Sociology: taking as a starting point the appreciation of the comic attitude in nature and in cultural history, we progressed through a short history of laughter and derision, followed by an overview of humour theories. After having established such a frame, we considered an integration of scientific studies in the field of laughter and humour as a base for our study scheme, in order to come to a definition of the comic author as a recognised, powerful and authoritative social figure who acts as a critic of conventions. The history of laughter and the comic we briefly summarized, based on the one related by the French scholar Georges Minois in his work (Minois 2004), has been taken into account in the view that the humorous attitude is one of man's characteristic traits, always present and witnessed throughout the ages, though subject in most cases to repression by culturally and politically conservative power. This sort of Super-Ego notwithstanding, or perhaps because of it, the comic impulse proved irreducible precisely in its influence on current cultural debates. Basing ourselves mainly on Robert R. Provine's (Provine 2001), Fabio Ceccarelli's (Ceccarelli 1988), Arthur Koestler's (Koestler 1975) and Peter L. Berger's (Berger 1995) scientific essays on the actual occurrence of laughter and smiling in complex social situations, we underlined the many pieces of evidence for how the use of the comic, humour and wit (in a Freudian sense) can best be comprehended if seen as a common mental process designed for the improvement of knowledge, in which we traced a strict relation with the play-element the Dutch historian Huizinga highlighted in his famous essay, Homo Ludens (Huizinga 1955). We considered the comic and humour/wit as different sides of the same coin, and showed how the demonstrations scientists have provided on this particular subject are not conclusive, given that the mental processes cannot yet be irrefutably shown to be separated as regards gradations in comic expression and reception: in fact, different outputs in expression might lead back to one and the same production process, following the general "Economy Rule" of evolution; man is the only animal who lies, meaning by this that one feeling is not necessarily biuniquely associated with one and the same outward display, so human expressions are not validation proofs for feelings. Considering societies, we found that in nature they are all organized in more or less the same way, that is, in élites who govern over a community which, in turn, recognises them as legitimate delegates for that task; we inferred from this the epistemological possibility of the existence of an added ruling figure alongside the political and religious ones: this figure being the comic, who is the person in charge of expressing true feelings towards given subjects of contention.
Any community owns one, and his very peculiar status is validated by the fact that his place is within the community, living in it and speaking to it, but at the same time he is outside it, in the sense that his action focuses mainly on shedding light on ideas and objects placed outside the boundaries of social convention: taboos, fears, sacred objects and finally culture are the favourite targets of the comic person's arrows. This is the reason for the word a(rche)typical as applied to the comic figure in society: atypical in a sense, because unconventional and disrespectful of traditions, critical and never at ease with unblinkered respect for canons; archetypical, because the "village fool", buffoon, jester, or anyone in any kind of society who plays such roles, is an archetype in the Jungian sense, i.e. a personification of an irreducible side of human nature that everybody instinctively knows: the beginner of a tradition, the perfect type, what is most conventional of all and therefore the exact opposite of the atypical. There is, we think, an intrinsic necessity for such figures in societies, just like politicians and priests, who should play an elitist role in order to guide and rule not for their own benefit but for the good of the community. We are not naïve and do know that actual holders of power always tend to keep it indefinitely: the "social comic" as a role of power has nonetheless the distinctive feature of being the only job whose tension is not towards stability. It has in itself the rewarding permission of contradiction, for the very reason we set out before: the comic must cast an eye both inside and outside society, and his vision may perforce not be consistent; it is then satisfactory for the popularity it gives him amongst readers and audience. Finally, the difference between governors, priests and comic figures is the seriousness of the first two (fundamentally monologic) and the merry contradiction of the third (essentially dialogic). MPs, mayors, bishops and pastors should always console, comfort and soothe the popular mood in respect of public convention; the comic has the opposite task of provoking, urging and irritating, accomplishing at the same time a sort of control over the soothing powers of society, the keepers of righteousness. In this view, the comic person assumes paramount importance in counterbalancing the administration of power, whether in the form of acting in public places or in written pieces circulated for private reading. At this point our Irish writer Brian O'Nolan (1911-1966) comes into question, the real name that stood behind the more famous masks of Flann O'Brien, novelist, author of At Swim-Two-Birds (1939), The Hard Life (1961), The Dalkey Archive (1964) and, posthumously, The Third Policeman (1967); and of Myles na Gopaleen, journalist, keeper for more than 25 years of the Cruiskeen Lawn column in The Irish Times (1940-1966), and author of the famous book-parody in Irish, An Béal Bocht (1941), later translated into English as The Poor Mouth (1973). Brian O'Nolan, a professional senior civil servant of the Republic, has never seen his authorship recognised in literary studies, since all of them concentrated on his alter egos Flann, Myles and some others he used for minor contributions. So far as we are aware, this is the first study which places the real name in the title, thereby acknowledging in him a unity of intent that no one did before.
And this choice in titling is not a mere mark of distinction for its own sake, but also a wilful sign of how his opus should now be reconsidered. In effect, the aim of this study is precisely to demonstrate how the empirical author Brian O'Nolan was the real Deus in machina, the master of puppets who skilfully directed all of his identities in planned directions, so as to completely fulfil the role of the comic figure we explained before. Flann O'Brien and Myles na Gopaleen were personae and not persons, but the impression one gets from the critical studies on them is the exact opposite. Literary consideration, which came only after O'Nolan's death, began with Anne Clissmann's work, Flann O'Brien: A Critical Introduction to His Writings (Clissmann 1975), while the most recent book is Keith Donohue's The Irish Anatomist: A Study of Flann O'Brien (Donohue 2002), passing through M. Keith Booker's Flann O'Brien, Bakhtin and Menippean Satire (Booker 1995), Keith Hopper's Flann O'Brien: A Portrait of the Artist as a Young Post-Modernist (Hopper 1995) and Monique Gallagher's Flann O'Brien, Myles et les autres (Gallagher 1998). There have also been a couple of biographies, which incidentally somehow try to explain critical points of his literary production, while many critical studies do the same in the opposite direction, trying to found critical points of view on the author's restless life and habits. At this stage, we attempted to merge into O'Nolan's corpus the journalistic articles he wrote, more than 4,200, for roughly two million words over the column's 26-year run. To justify this, we appealed to several considerations about the figure O'Nolan used as a writer: Myles na Gopaleen (later simplified to na Gopaleen), who was the equivalent of the street artist or storyteller, speaking to his imaginary public and trying to involve it in his stories, quarrels and debates of all kinds. First of all, he relied much on language for the reactions he would obtain, playing on, and with, words so as to ironically unmask untrue relationships between words and things. Secondly, he pushed to the limit the convention of addressing spectators and listeners usually employed in live performance, stretching its role in written discourse to achieve a greater effect of reader involvement. Lastly, he profited much from what we labelled his "specific weight", i.e. the potential influence in society given by his recognised authority in certain matters, a position from which he could launch deeper attacks on conventional beliefs, so complying with the duty of the comic we hypothesised before: that of criticising society even under threat of losing the benefits the post guarantees. That seemingly masochistic tendency has its rationale. Every representative has many privileges on the assumption that he, or she, has great responsibilities in administrating. The higher those responsibilities are, the higher the reward, but also the severer the punishment for the misdeeds done while in charge. But we all know that not everybody accepts the rules, and many try to use their power for their personal benefit and do not want to undergo the law's penalties. The comic, showing in this case more civic sense than others, and helped very much in this by his lack of access to the use of public force, finds in the role of the scapegoat the right accomplishment of his task, accepting the punishment when his breaking of the conventions is too stark to be forgiven.
As Ceccarelli demonstrated, the role of the object of laughter (comic, ridicule) has its very own positive side: there is freedom of expression for the person, and at the same time integration in society, even though at low levels. Thus the banishment of a "social" comic can never amount to total extirpation from society, revealing how the scope of the comic lies on an entirely fictional layer, bearing no relation to facts, nor real consequences in terms of physical health. Myles na Gopaleen, mastering these three characteristics we postulated to the highest degree, can be considered an author worth noting; and the oeuvre he wrote, the whole collection of Cruiskeen Lawn articles, is rightfully a novel because it respects the canons of the genre, especially regarding the authorial figure and his relationship with the readers. In addition, his work can be studied even if we cannot conduct our research on the whole of it, this procedure being justified precisely because of the resemblance to the real figure of the storyteller: its "chapters" (the daily articles) had a format that even the distracted reader could follow, even one who had not read each and every article before. So we can also critically consider a good part of them, as collected in the seven volumes published so far, with the addition of some others outside the collections, because completeness in this case is not at all a guarantee of better precision in the assessment; on the contrary, examination of the totality of the articles might lead us to consider him as a person and not a persona. Having clarified these points, we proceeded further in considering tout court the works of Brian O'Nolan as the works of a single author, rather than complicating the references with many names which are none other than well-wrought sides of the same personality. By putting O'Nolan as the correct object of our research, empirical author of the works of the personae Flann O'Brien and Myles na Gopaleen, a clearer literary landscape emerges: the comic author Brian O'Nolan, self-conscious of his paramount role in society as both a guide and a scourge, in a word as an a(rche)typical, intentionally chose to differentiate his personalities so as to create different perspectives in different fields of knowledge by using, in addition, different means of communication: novels and journalism. We finally compared the newly assessed author Brian O'Nolan with other great Irish comic writers in English, such as James Joyce (the one everybody named as the master in the field), Samuel Beckett and Jonathan Swift. This comparison showed once more how O'Nolan is in no way inferior to these authors who, greatly celebrated by critics, nonetheless failed to achieve the broad public recognition O'Nolan received as Myles, awarded by the daily audience he reached and influenced with his Cruiskeen Lawn column. For this reason, we believe him to be representative of the comic figure's function as a social regulator and as a builder of solidarity, such as that Raymond Williams spoke of in his work (Williams 1982), with in mind the aim of building a "culture in common". There is no way for a "culture in common" to be acquired if we do not accept the fact that even the most functional society rests on conventions, and in a world more and more "connected" we need someone to help everybody negotiate with different cultures and persons.
The comic gives us a worldly perspective which is at the same time comfortable and distressing, but in the end not harmful, as the one furnished by politicians could be: he lets us peep into parallel worlds without moving too far from our armchairs and, as a consequence, is the one who does his best to improve our understanding of things.
Abstract:
Waste, as objects, engages all human institutions in a struggle to define the place it occupies and hence the value it takes on. In this dynamic, waste management becomes a total social fact that involves all human institutions in a territorialised struggle over definition. The history of the environmental movement shows how, starting from unease towards the object, we have moved to unease towards the ideas that generate it. Ecological modernisation and democratic modernisation seem, for a certain period, to go hand in hand. In recent cases of conflict, and in the in-depth case study of a provincial waste-management plan, the anticipatory character of environmental activism is making investments and strategic results increasingly costly and uncertain. Even the principles of the policies are being called into question. Sustainability is to be sought in a relativisation of policy principles and of the technical assessment tools (e.g. LCA) towards greater participation by all actors. A governance model is proposed that starts from territorial administrative coordination of the logistics networks, hence a geographical adjustment of the ATOs and a greater role for them in managing the coordination and planning process. These actions must in turn open up to the (ecological and economic) flows and to their relevant actors: from multi-utility companies to environmentalists. Finally, a moment of democratic control is needed, which can serve an arbitration function in conflicts between actors or a verification function. The research moves between history and philosophy, empirical research and theoretical reflection. Active inquiry techniques were also used, such as focus groups and interviews.