991 results for Non-representational methodologies
Abstract:
Architects and urban planners have a long tradition of learning the tools of the social sciences, especially those that allow them to better analyse and describe the environments and the people they work for. This has led architects to develop better tools for observing and describing the social realm, and not only the material one. Most of the time, however, this interdisciplinary approach has identified the social sciences, and anthropology in particular, with ethnography. This article starts from the critique of that identification made by the anthropologist Tim Ingold and focuses on what he proposes as the central method of anthropology: participant observation. It then reviews several current proposals by social scientists who seek to develop a non-representational, future-oriented discipline, a goal closer to that of architecture. The article attempts to imagine how such a transdisciplinary practice could develop.
Abstract:
In this paper, we compare four different Value-at-Risk (VaR) methodologies through Monte Carlo experiments. Our results indicate that the method based on quantile regression with an ARCH effect dominates the other methods that require a distributional assumption. In particular, we show that the non-robust methodologies are more likely to predict VaRs with too many violations. We illustrate our findings with an empirical exercise in which we estimate VaR for returns of the São Paulo stock exchange index, IBOVESPA, during periods of market turmoil. Our results indicate that the robust method based on quantile regression presents the smallest number of violations.
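The violation counts referred to above come from backtesting: a VaR forecast at level alpha should be exceeded by losses on roughly a fraction alpha of days. The paper's preferred estimator is a quantile regression with an ARCH effect; the sketch below only illustrates the violation-counting step, using a simple rolling historical quantile as a stand-in forecaster (the 5% level, the window length and the simulated returns are assumptions, not choices from the paper).

```python
import numpy as np

rng = np.random.default_rng(0)
returns = rng.standard_t(df=5, size=1500) * 0.01   # simulated heavy-tailed daily returns (assumption)

alpha, window = 0.05, 250                          # 5% VaR, one-year rolling window (assumptions)
violations, n_forecasts = 0, 0
for t in range(window, len(returns)):
    # One-step-ahead VaR forecast: empirical alpha-quantile of the past `window` returns.
    var_t = np.quantile(returns[t - window:t], alpha)
    n_forecasts += 1
    violations += int(returns[t] < var_t)          # violation: realised return falls below the forecast quantile

print(f"violation rate: {violations / n_forecasts:.3f} (nominal level: {alpha})")
```

In a Monte Carlo comparison like the one described, a method is judged by how close its empirical violation rate stays to the nominal level across replications.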
Abstract:
Book review
Abstract:
Geography has long been a predominantly visual discipline, but recent work in geography has sought to explore the multisensory, embodied, emotional and affective dimensions of people’s relations with places. One way to engage this type of exploration is through the use of sound walks: walks along a specified route accompanied by a soundtrack (on headphones or stationary speakers) that conveys information, enacts a story, produces an ambience or atmosphere, or illuminates certain aspects of the environment through which the listener is walking. This thesis aims to show how geographers can benefit from using sound walks as thinking tools, representational tools and teaching tools. Drawing on my own experiences producing sound walks, I first examine the ways that sound walk production processes help generate productive geographical thinking for those producing sound walks (Chapter Two). The various stages of producing a sound walk require different skill sets, pose different challenges, and require different sorts of environmental awareness, and therefore present novel opportunities for developing geographical insights about specific places or spatial relations. Second, I focus on four experientially-oriented aspects of sound walks – using multiple senses, walking, contingency, and moments of interaction – to argue that sound walks can be useful representational tools for geographers, whether those creating sound walks subscribe to a representational or non-representational theory of knowledge (Chapter Three). The value of sound walks as representational tools is in the experience of ‘doing’ them. That is, audiences discover for themselves through interaction what is being represented, rather than having it delivered to them. The experiential elements of ‘doing’ sound walks recommend them as potentially helpful representational tools for geographers. Third, by examining the work of a small sample of fourth year “Advanced Geography of Music” students, I develop the argument that sound walks can be effective tools for teaching students and for creating circumstances for students to learn independently (Chapter Four). Sound walks have potential to be effective pedagogical tools because they are commensurate with several key pedagogical schools of thought that emphasise the importance of requiring students to engage actively with their environment using a combination of senses. The thesis demonstrates that sound walks are a worthwhile resource for geographers to use theoretically, representationally and pedagogically in their work. The next step is for geographers to put them into practice and realize this potential.
Abstract:
Thrift [2008. Non-representational theory: space, politics, affect, 65. Abingdon: Routledge] has identified disenchantment as “[o]ne of the most damaging ideas” within social scientific and humanities research. As we have argued elsewhere, “[m]etanarratives of disenchantment and their concomitant preoccupation with destructive power go some way toward accounting for the overwhelmingly ‘critical’ character of geographical theory over the last 40 years” [Woodyer, T. and Geoghegan, H., 2013. (Re)enchanting geography? The nature of being critical and the character of critique in human geography. Progress in Human Geography, 37 (2), 195–214]. Through its experimentation with different ways of working and writing, cultural geography plays an important role in challenging extant habits of critical thinking. In this paper, we use the concept of “enchantment” to make sense of the deep and powerful affinities exposed in our research experiences and how these might be used to pursue a critical, yet more cheerful way of engaging with the geographies of the world.
Abstract:
The right ventricle has become an increasing focus in cardiovascular research. In this position paper, we give a brief overview of the specific pathophysiological features of the right ventricle, with particular emphasis on functional and molecular modifications as well as therapeutic strategies in chronic overload, highlighting the differences from the left ventricle. Importantly, we put together recommendations on promising topics of research in the field, experimental study design, and functional evaluation of the right ventricle in experimental models, from non-invasive methodologies to haemodynamic evaluation and ex vivo set-ups.
Abstract:
Several market risk measurement methodologies have been developed and refined over the last few decades. Some methodologies use non-parametric approaches, others parametric ones. Some are more theoretical, while others are more practical, relying on computational resources and simulation. While some methodologies preserve their original form, others are hybrids, combining features of two or more approaches. In this work we compare market risk measurement methodologies for the Brazilian financial market. We evaluate the results of non-parametric and parametric VaR methodologies applied to a fixed-income, an equity and a mixed portfolio over the period from 2000 to 2006. The non-parametric methodologies evaluated were: fixed-weight Historical Simulation, fixed-weight Antithetic Historical Simulation, exponentially weighted Historical Simulation and Scenario Analysis. The parametric methodologies evaluated were: fixed-weight Delta-Normal VaR, exponentially weighted Delta-Normal VaR (EWMA), fixed-weight Monte Carlo Simulation and exponentially weighted Monte Carlo Simulation. The comparison of these methodologies was based on statistical measures of conservatism, accuracy and efficiency.
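As a rough illustration of two of the families compared above, the sketch below computes a one-day Historical Simulation VaR and an exponentially weighted (EWMA) Delta-Normal VaR for a single return series. The decay factor of 0.94, the 99% confidence level and the simulated data are common textbook choices, not the parameters used in the study, and a real portfolio would aggregate its positions before applying either method.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
returns = rng.normal(0.0, 0.012, size=1000)      # simulated daily portfolio returns (assumption)

alpha = 0.01                                     # 99% confidence level (assumption)

# Historical Simulation: VaR is (minus) the empirical alpha-quantile of past returns.
var_hist = -np.quantile(returns, alpha)

# EWMA (RiskMetrics-style) Delta-Normal: recursive variance with decay factor lam.
lam = 0.94                                       # common decay factor (assumption)
sigma2 = returns[0] ** 2                         # crude initialisation, converges over the sample
for r in returns[1:]:
    sigma2 = lam * sigma2 + (1 - lam) * r ** 2
var_ewma = -norm.ppf(alpha) * np.sqrt(sigma2)    # parametric quantile of a normal with EWMA variance

print(f"Historical Simulation VaR: {var_hist:.4f}")
print(f"EWMA Delta-Normal VaR:     {var_ewma:.4f}")
```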
Abstract:
The research is part of a survey, funded by the Reno River Basin Regional Technical Service of the Emilia-Romagna Region, for the characterisation of the hydraulic and geotechnical conditions of river embankments. The hydraulic safety of the Reno River, one of the main rivers in north-eastern Italy, is of primary importance to the Emilia-Romagna regional administration. The large longitudinal extent of the banks (several hundred kilometres) has generated great interest in non-destructive geophysical methods, which, compared with other methods such as drilling, allow faster and often less expensive acquisition of high-resolution data. The present work aims to assess Ground Penetrating Radar (GPR) for the detection of local non-homogeneities (mainly stratigraphic contacts, cavities and conduits) inside the embankments of the Reno River and its tributaries, taking into account supplementary data collected with traditional destructive tests (boreholes, cone penetration tests, etc.). A comparison with other non-destructive methodologies, namely electrical resistivity tomography (ERT), Multi-channel Analysis of Surface Waves (MASW) and FDEM induction, was also carried out in order to verify the usability of GPR and to support the integration of various geophysical methods into the regular maintenance and inspection of the embankments. The first part of this thesis describes the state of the art concerning the geographic, geomorphologic and geotechnical characteristics of the embankments of the Reno River and its tributaries, as well as some geophysical applications on embankments of European and North American rivers that served as the bibliographic basis for this thesis. The second part is an overview of the geophysical methods employed in this research (with particular attention to GPR), reporting their theoretical basis and examining in greater depth some techniques for the analysis and representation of geophysical data applied to river embankments. The subsequent chapters, following the main scope of this research, which is to highlight the advantages and drawbacks of Ground Penetrating Radar applied to the embankments of the Reno River and its tributaries, present the results obtained by analysing different situations that could lead to the formation of weakness zones and, subsequently, to embankment failure. Among the advantages, a considerable acquisition speed and a spatial resolution of the acquired data unmatched by the other methodologies were recorded. Among the drawbacks, several factors related to attenuation of the propagating waves, due to the varying content of clay, silt and sand, as well as surface effects, significantly limited the correlation between GPR profiles and geotechnical information and therefore compromised the embankment safety assessment. In summary, Ground Penetrating Radar can be a suitable tool for checking the condition of river dikes, but its use is significantly limited by the geometric and geotechnical characteristics of the levees of the Reno River and its tributaries. In fact, only the shallower part of the embankment could be investigated, and the information obtained relates only to changes in electrical properties, without any quantitative measurement.
Furthermore, GPR application proved ineffective for a preliminary assessment of embankment safety conditions, whereas for detailed campaigns at shallow depth, which aim to achieve immediate results with optimal precision, its use is highly recommended. The cases in which a multidisciplinary approach was tested revealed an effective integration of the various geophysical methodologies employed, producing qualitative results in the preliminary phase (FDEM), a quantitative and highly reliable description of the subsoil (ERT) and, finally, fast and highly detailed analysis (GPR). As a recommendation for future research, the combined use of several geophysical techniques to assess the safety conditions of river embankments is strongly suggested, especially when facing a likely flood event, when the entire extent of the embankments must be investigated.
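For readers unfamiliar with GPR interpretation, the depth of a reflector is commonly estimated from its two-way travel time and an assumed propagation velocity, v = c / sqrt(eps_r). The sketch below shows this standard conversion; the relative permittivity values are generic literature figures for dry and wet embankment soils, not measurements from the Reno River survey.

```python
import math

C = 0.3  # speed of light in vacuum, m/ns

def reflector_depth(twt_ns: float, eps_r: float) -> float:
    """Depth (m) of a reflector from two-way travel time (ns) and relative permittivity."""
    v = C / math.sqrt(eps_r)   # propagation velocity in the soil, m/ns
    return v * twt_ns / 2.0    # divide by 2: the wave travels down and back

# Assumed permittivities (typical literature values, not site measurements):
for label, eps_r in [("dry sandy soil", 5.0), ("wet silty/clayey soil", 25.0)]:
    print(f"{label}: 40 ns two-way time -> {reflector_depth(40.0, eps_r):.2f} m")
```

Because the velocity depends on a permittivity that is often poorly known, depth estimates inherit that uncertainty, which is one reason calibration against boreholes and penetration tests matters.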
Abstract:
Introduction: Computer-Aided Design (CAD) and Computer-Aided Manufacture (CAM) have been developed to fabricate fixed dental restorations more accurately, faster and more cost-effectively than the conventional method. Two main methods exist in dental CAD/CAM technology: the subtractive and additive methods. While the fitting accuracy of both methods has been explored, no study yet has compared the fabricated restoration (CAM output) to its CAD in terms of accuracy. The aim of the present study was to compare the output of various dental CAM routes to a single initial CAD and establish the accuracy of fabrication. The internal fit of the various CAM routes was also investigated. The null hypotheses tested were: 1) no significant differences between the CAM output and the CAD, and 2) no significant differences between the various CAM routes. Methods: An aluminium master model of a standard premolar preparation was scanned with a contact dental scanner (Incise, Renishaw, UK). A single CAD was created on the scanned master model (InciseCAD software, V2.5.0.140, UK). Twenty copings were then fabricated by sending the single CAD to a range of CAM routes. The copings were grouped (n=5) as: laser-sintered CoCrMo (LS), 5-axis milled CoCrMo (M-CoCrMo), 3-axis milled zirconia (ZAx3) and 4-axis milled zirconia (ZAx4). All copings were micro-CT scanned (Phoenix X-Ray Nanotom-S, Germany; 155 kV, 60 µA, 3600 projections) to produce 3-dimensional (3D) models. A novel methodology was created to superimpose the micro-CT scans on the CAD (GOM Inspect software, V7.5SR2, Germany) to indicate inaccuracies in manufacturing. The accuracy in terms of coping volume was explored. The distances from the surfaces of the micro-CT 3D models to the surfaces of the CAD model (CAD Deviation) were investigated after creating surface colour deviation maps. Localised digital sections of the deviations (occlusal, axial and cervical) and selected focussed areas were then quantitatively measured using software (GOM Inspect software, Germany). A novel methodology was also explored to digitally align (Rhino software, V5, USA) the micro-CT scans with the master model to investigate internal fit. Fifty digital cross sections of the aligned scans were created. Point-to-point distances were measured at 5 levels in each cross section. The five levels were: Vertical Marginal Fit (VF), Absolute Marginal Fit (AM), Axio-margin Fit (AMF), Axial Fit (AF) and Occlusal Fit (OF). Results: The volume measurements were summarised as V(M-CoCrMo) (62.8 mm³) > V(ZAx3) (59.4 mm³) > V(CAD) (57 mm³) > V(ZAx4) (56.1 mm³) > V(LS) (52.5 mm³), and all differences were statistically significant. CAD Deviation was presented as surface maps in which deviations appear as areas of different colour. No significant differences were observed between any of the coping groups at the internal cervical aspect. Significant differences were observed for the following comparisons: 1) LS > M-CoCrMo at the internal occlusal, internal axial and external axial aspects; 2) ZAx3 > ZAx4 at the external occlusal and external cervical aspects; 3) ZAx3 < ZAx4 at the internal occlusal aspect; and 4) M-CoCrMo > ZAx4 at the internal occlusal and internal axial aspects. The mean values of AMF and AF were significantly different, with CAD > M-CoCrMo and CAD > ZAx4. Only the VF of M-CoCrMo was comparable with the CAD internal fit. All VF and AM values were within the clinically acceptable fit (120 µm). Conclusion: The investigated CAM methods reproduced the CAD accurately at the internal cervical aspect of the copings.
However, localised deviations at the axial and occlusal aspects of the copings may suggest the need for modifications in these areas prior to fitting and veneering with porcelain. The CAM groups evaluated also showed different levels of internal fit, thus rejecting the null hypotheses. The novel non-destructive methodologies for CAD/CAM accuracy and internal fit testing presented in this thesis may be useful evaluation tools for similar applications.
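The CAD Deviation analysis above was performed in GOM Inspect after superimposing each micro-CT model on the CAD. As a simplified illustration of the underlying idea, the sketch below computes unsigned nearest-point distances between a scanned point cloud and a reference point cloud using a k-d tree; dedicated inspection software works on aligned triangulated surfaces and reports signed point-to-surface distances, and the point clouds here are synthetic placeholders rather than data from the study.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)

# Placeholder point clouds (synthetic): a reference "CAD" surface and a "scan" of it
# with small manufacturing-like perturbations. Real data would come from the CAD mesh
# and the micro-CT reconstruction after best-fit alignment.
cad_points = rng.uniform(0, 10, size=(5000, 3))
scan_points = cad_points + rng.normal(0.0, 0.02, size=cad_points.shape)

tree = cKDTree(cad_points)                 # spatial index on the reference points
distances, _ = tree.query(scan_points)     # nearest-neighbour distance for every scan point

print(f"mean deviation:  {distances.mean():.4f}")
print(f"95th percentile: {np.percentile(distances, 95):.4f}")
```

Summaries of such distances, region by region, are what surface colour deviation maps visualise.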
Abstract:
This paper analyses how the Colombian state has sought to construct a national identity at three International Expositions, through representations built on a discourse that mixes the political, the commercial and the cultural, generating images that do not always match reality.
Abstract:
This paper presents some methodologies for reactive energy measurement, considering three modern power theories that are suitable for three-phase, four-wire, non-sinusoidal and unbalanced circuits. The theories were applied to profiles collected in electrical distribution systems, which have the real characteristics of the voltages and currents measured by commercial reactive energy meters. The experimental results are presented in order to analyze the accuracy of the methodologies, taking the standard IEEE 1459-2010 as a reference. Finally, for additional comparisons, the theories are confronted with the modern Yokogawa WT3000 energy meter and three samples of a commercial energy meter through an experimental setup. © 2011 IEEE.
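As background to the comparison above, the sketch below computes basic single-phase quantities in the spirit of IEEE 1459 from sampled waveforms: active power P, RMS values, apparent power S and the nonactive power N = sqrt(S^2 - P^2). The three-phase, four-wire effective quantities used for unbalanced circuits are more involved, and the distorted test waveforms here are arbitrary examples rather than one of the measured profiles.

```python
import numpy as np

f, fs, cycles = 60.0, 15360.0, 10             # fundamental, sampling rate, record length (assumptions)
t = np.arange(int(fs * cycles / f)) / fs

# Arbitrary distorted, phase-shifted test waveforms (not a measured profile).
v = 311.0 * np.sin(2 * np.pi * f * t) + 15.0 * np.sin(2 * np.pi * 5 * f * t)
i = 10.0 * np.sin(2 * np.pi * f * t - np.pi / 6) + 2.0 * np.sin(2 * np.pi * 7 * f * t)

P = np.mean(v * i)                            # active power (average instantaneous power)
Vrms = np.sqrt(np.mean(v ** 2))
Irms = np.sqrt(np.mean(i ** 2))
S = Vrms * Irms                               # apparent power
N = np.sqrt(S ** 2 - P ** 2)                  # nonactive power as defined in IEEE 1459

print(f"P = {P:.1f} W, S = {S:.1f} VA, N = {N:.1f} var, PF = {P / S:.3f}")
```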
Abstract:
Routine monitoring of environmental pollution demands simplicity and speed without sacrificing sensitivity or accuracy. The development and application of sensitive, fast and easy-to-implement analytical methodologies for detecting emerging and traditional water and airborne contaminants in South Florida is presented. A novel method was developed for quantification of the herbicide glyphosate based on lyophilization followed by derivatization and simultaneous detection by fluorescence and mass spectrometry. Samples were analyzed from water canals that will hydrate estuarine wetlands of Biscayne National Park, detecting inputs of glyphosate from both aquatic usage and agricultural runoff from farms. A second study describes a set of fast, automated LC-MS/MS protocols for the analysis of dioctyl sulfosuccinate (DOSS) and 2-butoxyethanol, two components of Corexit®. Around 1.8 million gallons of those dispersant formulations were used in the response efforts for the Gulf of Mexico oil spill in 2010. The methods presented here allow the trace-level detection of these compounds in seawater, crude oil and commercial dispersant formulations. In addition, two methodologies were developed for the analysis of well-known pollutants, namely Polycyclic Aromatic Hydrocarbons (PAHs) and airborne particulate matter (APM). PAHs are ubiquitous environmental contaminants and some are potent carcinogens. Traditional GC-MS analysis is labor-intensive and consumes large amounts of toxic solvents. My study provides an alternative automated SPE-LC-APPI-MS/MS analysis with minimal sample preparation and lower solvent consumption. The system can inject, extract, clean, separate and detect 28 PAHs and 15 families of alkylated PAHs in 28 minutes. The methodology was tested with environmental samples from Miami. Airborne particulate matter is a mixture of particles of chemical and biological origin. Assessment of its elemental composition is critical for the protection of sensitive ecosystems and public health. The APM collected from Port Everglades between 2005 and 2010 was analyzed by ICP-MS after acid digestion of filters. The most abundant elements were Fe and Al, followed by Cu, V and Zn. Enrichment factors show that hazardous elements (Cd, Pb, As, Co, Ni and Cr) are introduced by anthropogenic activities. Data suggest that the major sources of APM were a power plant, road dust, industrial emissions and marine vessels.
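The enrichment factors mentioned above are conventionally computed by normalising each element to a crustal reference element such as Al or Fe; a small sketch of the standard definition follows. The choice of reference element, the illustrative concentrations and the informal reading that EF values well above about 10 suggest an anthropogenic contribution are general conventions, not details taken from this study.

```python
def enrichment_factor(c_x_sample, c_ref_sample, c_x_crust, c_ref_crust):
    """EF_X = (C_X / C_ref)_sample / (C_X / C_ref)_crust, with e.g. Al as the reference element."""
    return (c_x_sample / c_ref_sample) / (c_x_crust / c_ref_crust)

# Illustrative values only (not data from the study): Pb and Al in an aerosol sample (ng/m3)
# versus typical upper-crust abundances (mg/kg); the units cancel within each ratio.
ef_pb = enrichment_factor(c_x_sample=5.0, c_ref_sample=700.0,
                          c_x_crust=17.0, c_ref_crust=80400.0)
print(f"EF(Pb) ~ {ef_pb:.0f}")
```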
Abstract:
The purpose of this study was to implement research methodologies and to assess the effectiveness and impact of management tools in promoting best practices for the long-term conservation of the endangered African wild dog (Lycaon pictus). Different methods were included in the project framework to investigate and expand the applicability of these methodologies to free-ranging African wild dogs in the southern African region: ethological, behavioural-endocrinology and ecological field methodologies were tested and implemented. Additionally, research was performed to test the effectiveness and implications of a contraceptive implant (Suprenolin) as a management tool for a subpopulation of the species hosted in fenced areas. Particular attention was given to the social structure and survival of treated packs. This research provides useful tools and advances the applicability of these methods for field studies, standardising and improving research instruments in the fields of conservation biology and behavioural endocrinology. The results reported here provide effective methodologies that expand the applicability of non-invasive endocrine assessment to field settings that were previously prohibitive, together with a validation of sampling methods for faecal hormone analysis. The final aim was to fill a knowledge gap on the behaviour of the species, to provide a common ground for future researchers applying non-invasive methods to research on this species, and to test the effectiveness of contraception in a managed metapopulation.
Abstract:
Pathogen detection in foods by reliable methodologies is very important to guarantee microbiological safety. However, peculiar characteristics of certain foods, such as the autochthonous microbiota, can directly influence pathogen development and detection. With the objective of verifying the performance of the official analytical methodologies for the isolation of Listeria monocytogenes and Salmonella in milk, different concentrations of these pathogens were inoculated into raw milk treatments with different levels of mesophilic aerobes and then submitted to the traditional isolation procedures for the inoculated pathogens. Listeria monocytogenes was inoculated in the range of 0.2-5.2 log CFU/mL in treatments with 1.8-8.2 log CFU/mL of mesophilic aerobes. Salmonella Enteritidis was inoculated at 0.9-3.9 log CFU/mL in treatments with 3.0-8.2 log CFU/mL. The results indicated that recovery was not possible, or was more difficult, in treatments with high counts of mesophilic aerobes and low levels of the pathogens, indicating interference from the raw milk autochthonous microbiota. This interference was more evident for L. monocytogenes, since recovery of this pathogen was not possible in treatments with mesophilic aerobe counts of 4.0 log CFU/mL or more and inoculum levels below 2.0 log CFU/mL. For S. Enteritidis the interference appeared to be more non-specific. © 2007 Elsevier GmbH. All rights reserved.