679 results for WWII artefacts
Abstract:
This project engages people with learning disabilities as co-researchers exploring museum interpretation through multisensory workshops that use microcontrollers and sensors to enable alternative interactive visitor experiences in museums and heritage sites. This article describes how the project brings together artists, engineers, and experts in multimedia advocacy, as well as people with learning disabilities, in the co-design of interactive multisensory objects that replicate or respond to objects of cultural significance in our national collections. Through a series of staged multisensory art and electronics workshops, people with learning disabilities explore how the different senses can be used to augment existing artefacts or create entirely new ones. The co-researchers employ multimedia advocacy tools to reflect on and communicate their experiences and findings.
Abstract:
Current commercially available Doppler lidars provide an economical and robust solution for measuring vertical and horizontal wind velocities, together with the ability to provide co- and cross-polarised backscatter profiles. The high temporal resolution of these instruments allows turbulent properties to be obtained from studying the variation in radial velocities. However, the instrument specifications mean that certain characteristics, especially the background noise behaviour, become a limiting factor for the instrument sensitivity in regions where the aerosol load is low. Turbulent calculations require an accurate estimate of the contribution from velocity uncertainty estimates, which are directly related to the signal-to-noise ratio. Any bias in the signal-to-noise ratio will propagate through as a bias in turbulent properties. In this paper we present a method to correct for artefacts in the background noise behaviour of commercially available Doppler lidars and reduce the signal-to-noise ratio threshold used to discriminate between noise, and cloud or aerosol signals. We show that, for Doppler lidars operating continuously at a number of locations in Finland, the data availability can be increased by as much as 50 % after performing this background correction and subsequent reduction in the threshold. The reduction in bias also greatly improves subsequent calculations of turbulent properties in weak signal regimes.
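The paper's actual correction procedure is not given in the abstract, but the general idea (estimate the range-dependent noise floor from gates assumed to be aerosol-free, subtract it, then apply a reduced SNR threshold) can be sketched as follows. The function name, the choice of the far gates as noise-only, and the threshold value are all illustrative assumptions, not the instrument's or the paper's actual settings:

```python
import numpy as np

def correct_background(snr_profiles, fit_gates=slice(-100, None), snr_threshold=0.008):
    """Illustrative background correction for Doppler-lidar SNR profiles.

    snr_profiles: (time, range) array of raw signal-to-noise ratios.
    fit_gates: range gates assumed to contain noise only (hypothetical choice).
    snr_threshold: reduced threshold applied after the correction (hypothetical).
    """
    corrected = np.empty_like(snr_profiles, dtype=float)
    gates = np.arange(snr_profiles.shape[1])
    for i, profile in enumerate(snr_profiles):
        # Fit a low-order polynomial to the assumed aerosol-free far gates and
        # subtract it, removing any range-dependent bias in the noise floor.
        coeffs = np.polyfit(gates[fit_gates], profile[fit_gates], deg=1)
        corrected[i] = profile - np.polyval(coeffs, gates)
    # Keep only gates whose corrected SNR exceeds the (reduced) threshold.
    mask = corrected > snr_threshold
    return corrected, mask
```

Removing the range-dependent bias first is what makes the lower threshold safe: without it, a sloping noise floor would either pass noise or reject weak aerosol signal, which is the bias that then propagates into turbulent properties.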
Abstract:
We present a minor but essential modification to the CODEX 1D-MAS exchange experiment. The new CONTRA method, which requires only minor changes to the original sequence, has advantages over the previously introduced S-CODEX, since it is less sensitive to artefacts caused by finite pulse lengths. The performance of this variant, including the finite-pulse effect, was confirmed by SIMPSON calculations and demonstrated on a number of dynamic systems.
Abstract:
This paper describes an automatic device for in situ and continuous monitoring of the ageing process occurring in natural and synthetic resins widely used in art and in the conservation and restoration of cultural artefacts. The results of tests carried out under accelerated ageing conditions are also presented. This easy-to-assemble palm-top device essentially consists of oscillators based on quartz crystal resonators coated with films of the organic materials whose response to environmental stress is to be assessed. The device contains a microcontroller which, at pre-defined time intervals, selects the oscillators and records and stores their oscillation frequency. The ageing of the coatings, caused by the environmental stress and resulting in a shift in the oscillation frequency of the modified crystals, can be straightforwardly monitored in this way. The kinetics of this process reflects the level of damage risk associated with a specific microenvironment. In this case, natural and artificial resins broadly employed in art and in the restoration of artistic and archaeological artefacts (dammar and Paraloid B72) were applied onto the crystals. The environmental stress was represented by visible and UV radiation, since the chosen materials are known to be photochemically active to different extents. In the case of dammar, the results obtained are consistent with previous data obtained using bench-top equipment by impedance analysis through discrete measurements, and confirm that the ageing of this material is reflected in the gravimetric response of the modified quartz crystals. As for Paraloid B72, the outcome of the assays indicates that the resin is resistant to visible light but very sensitive to UV irradiation. The use of a continuous monitoring system, apart from being obviously more practical, is essential to identify short-term (i.e.
reversible) events, like water vapour adsorption/desorption processes, and to highlight ageing trends or sudden changes in such trends.
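The abstract does not give the frequency-to-mass relation, but the gravimetric response of a coated quartz crystal is commonly described by the Sauerbrey equation, Δf = -2 f₀² Δm / (A √(ρ_q μ_q)). A minimal sketch with the standard quartz constants; the function name is hypothetical and the relation is the generic QCM one, not a formula quoted from the paper:

```python
import math

def sauerbrey_mass_change(delta_f_hz, f0_hz, area_cm2):
    """Mass change (grams) inferred from a QCM frequency shift via the
    Sauerbrey equation: delta_f = -2 * f0**2 * delta_m / (A * sqrt(rho_q * mu_q))."""
    rho_q = 2.648       # density of quartz, g/cm^3
    mu_q = 2.947e11     # shear modulus of AT-cut quartz, g/(cm*s^2)
    return -delta_f_hz * area_cm2 * math.sqrt(rho_q * mu_q) / (2.0 * f0_hz ** 2)
```

For example, a -10 Hz shift on a 10 MHz, 1 cm² crystal corresponds to roughly 44 ng of added mass, which is why slow photochemical ageing of a thin resin film is resolvable as a frequency drift.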
Abstract:
After WWII, there was much concern about protecting human rights all over the world. During the Cold War, huge displacements took place within different countries due to internal armed and ethnic conflicts. Millions of IDPs, uprooted by armed conflict or ethnic strife, faced human rights violations. In 2002, there were an estimated 20-25 million IDPs in the world (Phuong, p. 1). Internal displacement is a worldwide problem, and millions of people have been displaced in Africa and Asia. These internal displacements are not only the result of conflicts or human rights violations; sometimes they happen because of natural disasters. "All human beings are born free and equal in dignity and rights..." (Streich, Article 1). This article works as the foundation of human rights, giving every human being equal rights and the opportunity to maintain his or her dignity. Human rights issues related to human dignity must be taken very seriously and should not be ignored at any level. Many human rights issues are not always visible, such as privacy, security, equality, and the protection of social and cultural values. In this paper I apply the theoretical approach of "all human beings are equal in dignity and rights" to defend IDPs' rights.
Abstract:
The provenance of entities, whether electronic data or physical artefacts, is crucial information in practically all domains, including science, business and art. The increased use of software in automating activities provides the opportunity to add greatly to the amount we can know about an entity's history and the process by which it came to be as it is. However, it also presents difficulties: querying for the provenance of an entity could potentially return detailed information stretching back to the beginning of time, most of it possibly irrelevant to the querier. In this paper, we define the concept of a provenance query and describe techniques that allow us to perform scoped provenance queries.
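The paper's actual scoping techniques are not described in the abstract; as one illustrative notion of scope, a provenance query can be cut off at a fixed derivation depth so it does not stretch back to the beginning of time. A minimal sketch over a hypothetical derived-from mapping:

```python
from collections import deque

def scoped_provenance(derived_from, entity, max_depth):
    """Illustrative scoped provenance query: walk the derived-from graph
    breadth-first, but stop expanding beyond max_depth.

    derived_from: dict mapping an entity to the entities it was derived from.
    Returns the ancestors reachable within the scope, in discovery order.
    """
    seen = {entity}
    frontier = deque([(entity, 0)])
    ancestry = []
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_depth:
            continue  # scope boundary: record nothing past this depth
        for parent in derived_from.get(node, []):
            if parent not in seen:
                seen.add(parent)
                ancestry.append(parent)
                frontier.append((parent, depth + 1))
    return ancestry
```

For a chain d ← c ← b ← a, querying d with a depth-2 scope returns only c and b, illustrating how a scope bounds the otherwise unbounded history.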
Abstract:
In her January 20, 2015 interview with Michelle Dubert-Bellrichard, Jeuel Esmacher Bannister details her time at Winthrop from 1940-1944. She shares memories of professors in the music department, her opinions on the expectations placed on students, and what it was like going to school during WWII. Esmacher Bannister recalls stories of the Army Air Corps Cadets on campus, and the courses offered by the U.S. government that led her to a career as a Japanese and Russian cryptographer. This interview was conducted for inclusion in the Louise Pettus Archives and Special Collections Oral History Program.
Abstract:
The presence of a deterministic or stochastic trend in U.S. GDP has been a continuing debate in the macroeconomics literature. Ben-David and Papell (1995) found evidence in favor of trend stationarity using the secular sample of Maddison (1995). More recently, Murray and Nelson (2000) correctly criticized this finding, arguing that the Maddison data are plagued with additive outliers (AOs), which bias inference towards stationarity. Hence, they proposed to set the secular sample aside and conduct inference using a more homogeneous but shorter time-span post-WWII sample. In this paper we revisit the Maddison data by employing a test that is robust against AOs. Our results suggest that U.S. GDP can be modeled as a trend-stationary process.
Abstract:
Lucas (1987) has shown the surprising result that the welfare cost of business cycles is quite small. Using standard assumptions on preferences and a fully-fledged econometric model, we computed the welfare costs of macroeconomic uncertainty for the post-WWII era using the multivariate Beveridge-Nelson decomposition for trends and cycles, which considers not only business-cycle uncertainty but also uncertainty from the stochastic trend in consumption. The post-WWII period is relatively quiet, with the welfare costs of uncertainty being about 0.9% of per-capita consumption. Although changing the decomposition method substantially changed initial results, the welfare cost of uncertainty is qualitatively small in the post-WWII era: about $175.00 a year per capita in the U.S. We also computed the marginal welfare cost of macroeconomic uncertainty using this same technique. It is about twice as large as the welfare cost: about $350.00 a year per capita.
Abstract:
This thesis is composed of three articles on the subjects of macroeconomics and finance. Each article corresponds to a chapter and is written in paper format. In the first article, co-authored with Axel Simonsen, we model and estimate a small open economy for the Canadian economy in a two-country General Equilibrium (DSGE) framework. We show that it is important to account for the correlation between domestic and foreign shocks and for incomplete pass-through. In the second chapter-paper, co-authored with Hedibert Freitas Lopes, we estimate a regime-switching macro-finance model for the term structure of interest rates to study the US post-World War II (WWII) joint behavior of macro variables and the yield curve. We show that our model tracks the US NBER cycles well, that the addition of regime changes is important to explain the Expectations Theory of the term structure, and that macro variables have increasing importance in recessions for explaining the variability of the yield curve. We also present a novel sequential Monte Carlo algorithm to learn about the parameters and the latent states of the economy. In the third chapter, I present a Gaussian Affine Term Structure Model (ATSM) with latent jumps in order to address two questions: (1) what are the implications of incorporating jumps in an ATSM for Asian option pricing, in the particular case of the Brazilian DI Index (IDI) option, and (2) how do jumps and options affect the bond risk-premia dynamics. I show that the jump risk premium is negative in a scenario of decreasing interest rates (my sample period) and is important to explain the level of yields, and that Gaussian models without jumps and with constant-intensity jumps are good for pricing Asian options.
Abstract:
Lucas (1987) has shown a surprising result in business-cycle research: that the welfare costs of business cycles are relatively small. Using standard assumptions on preferences and a reasonable reduced form for consumption, we computed these welfare costs for the pre- and post-WWII eras, using three alternative trend-cycle decomposition methods. The post-WWII period is very quiet, but for the pre-WWII era this basic result is dramatically altered. For the Beveridge and Nelson decomposition, and reasonable preference parameter and discount values, we get a compensation of about 5% of consumption, which is by all means a sizable welfare cost (about US$ 1,000.00 a year).
Abstract:
The topic of corporate social responsibility has been gaining momentum and attention from both business organisations and civil society. The impacts of the post-war world, of globalisation, and of development add up to environmental and social consequences and pollution. As actors embedded in these realities, companies also suffer the consequences of these changes. The very role of organisations in society, and the limits and reach of their activity, are being viewed from new perspectives. The current conjuncture brings companies new challenges and possibilities with regard to Corporate Responsibility. These changes affect the organisation of the company itself (its objectives, its relationships with consumers, workers, etc.) and, as a consequence, its strategies. A large share of Corporate Responsibility initiatives is based on narrow views of action: essentially a reactive posture, a consequence of pressures and obligations. However, a new, more proactive proposal has been on the rise: in some pioneering companies, Corporate Responsibility is seen as an integral and fundamental part of business strategy, generating benefits not only for society but also for the company itself. This work begins with a review of the literature on the evolution of Social Responsibility. It then proposes a conceptual model relating Corporate Responsibility and Strategy, laying out the possible paths for integrating the two. The starting point, therefore, is the idea that Corporate Responsibility can be approached as a strategic issue, capable of addressing, at different levels and depths, points that are crucial for good business performance. The applicability of the proposed model is demonstrated through the analysis of illustrative examples of companies that have been incorporating Corporate Responsibility as an integral part of their business strategy.
In conclusion, we find that, when it is part of a formal strategy, the adoption of a responsible approach not only brings gains in image but also generates tangible results for the companies that practise it, while at the same time helping to secure, for society, a more sustainable world with a better quality of life for all.
Abstract:
Lucas (1987) has shown a surprising result in business-cycle research: the welfare costs of business cycles are very small. Our paper has several original contributions. First, in computing welfare costs, we propose a novel setup that separates the effects of uncertainty stemming from business-cycle fluctuations and economic-growth variation. Second, we extend the sample from which to compute the moments of consumption: the whole of the literature chose primarily to work with post-WWII data. For this period, actual consumption is already a result of counter-cyclical policies, and is potentially smoother than it would otherwise have been in their absence. So, we also employ pre-WWII data. Third, we take an econometric approach and compute explicitly the asymptotic standard deviation of welfare costs using the Delta Method. Estimates of welfare costs show major differences between the pre-WWII and post-WWII eras. They can differ by up to 15 times for reasonable parameter values (β = 0.985 and φ = 5). For example, in the pre-WWII period (1901-1941), welfare cost estimates are 0.31% of consumption if we consider only permanent shocks and 0.61% of consumption if we consider only transitory shocks. In comparison, the post-WWII era is much quieter: welfare costs of economic growth are 0.11% and welfare costs of business cycles are 0.037%, the latter being very close to the estimate in Lucas (0.040%). Estimates of marginal welfare costs are roughly twice the size of the total welfare costs. For the pre-WWII era, marginal welfare costs of economic-growth and business-cycle fluctuations are respectively 0.63% and 1.17% of per-capita consumption. The same figures for the post-WWII era are, respectively, 0.21% and 0.07% of per-capita consumption.
Abstract:
Over the last decades, digital inclusion public policies have invested significantly in the purchase of hardware and software in order to offer technology, specifically computers and broadband Internet, to Brazilian public teaching institutions. However, the education of teachers to handle these artefacts is set aside, even though the information society demands it. This dissertation therefore takes as its object of study the digital literacy practices performed by 38 (thirty-eight) teachers in initial and continuing education by means of the extension course Literacies and technologies: Portuguese language teaching and cyberculture demands. We aim to investigate the digital literacy practices of developing teachers at three specific moments: before, during, and after this extension action, with the intent to (i) delineate the digital literacy practices performed by the collaborators before the formative action; (ii) narrate the literacy events made possible by the extension course; and (iii) investigate the contributions of the education course to the collaborators' teaching practice. We sought theoretical contributions in literacy studies (BAYNHAM, 1995; KLEIMAN, 1995; HAMILTON; BARTON; IVANIC, 2000), specifically in digital literacy (COPE; KALANTZIS, 2000; BUZATO, 2001, 2007, 2009; SNYDER, 2002, 2008; LANKSHEAR; KNOBEL, 2002, 2008) and teacher education (PERRENOUD, 2000; SILVA, 2001). Methodologically, this virtual ethnography study (KOZINETS, 1997; HINE, 2000) is situated in the field of Applied Linguistics and adopts a quali-quantitative research approach (NUNAN, 1992; DÖRNYEI, 2006).
The data analysis showed that (i) before the course, the collaborators' digital literacy practices focused on the personal and academic dimensions of their realities at the expense of the professional dimension; (ii) during the extension action, the teachers collaboratively took part in hybrid study sessions with a pedagogical focus on the use of ICTs, engaging in digital literacy practices previously unknown to them; and (iii) after the course, the attitude of the collaborating teachers concerning the use of ICTs in their regular professional practice had changed, as those teachers started to make effective use of them, giving social visibility to what was produced in the school. We also observed that teachers in initial education acted as more experienced peers in the collaborative learning process, offering scaffolding (VYGOTSKY, 1978; BRUNER, 1985) to teachers in continuing education. This occurred because the undergraduates' digital literacy practices were more sophisticated, and because they belong to Generation Y (PRENSKY, 2001).