796 results for Notion of code
Abstract:
Heritage is defined by history, which is by nature multi-layered. The passage of time, and the perspectives it affords, enables and even necessitates constant re-examination and reinterpretation of history. What effect, then, do changes in historical perspective have upon the definition of heritage, which relies on an understanding of its history? The present paper engages with the notion of heritage, the criteria for its definition, and the mutable nature of such designations, with specific reference to architectural constructions and historic cities that enjoy, or have in the past enjoyed, the status of a ‘World Heritage Site’. Examples such as the Louvre museum in Paris or King’s Cross station in London make an interesting study, as they not only allow insight into the past but also reflect change and adaptation over time. Multiple alterations, some very recent, have modified them extensively since they were accorded ‘World Heritage Site’ status. These examples are contrasted with conflict-ridden sites such as the Bamiyan Valley, which UNESCO has placed on the ‘World Heritage in Danger’ list in view of the destruction of the Buddha statues in the region. The act of vandalism itself has had dual implications: while causing an irreparable loss to mankind of its heritage, it also serves as a potent symbol of the religious fanaticism that is a pressing concern of our times. The paper then moves on to explore the case of Dresden, which lost its ‘World Heritage’ status with the construction of the Waldschlösschen Bridge. This is a particularly interesting case because, after the near-total destruction of the city during the Second World War, it was necessary to reconstruct the historical city while simultaneously acknowledging and addressing modern-day requirements.
During the reconstruction, with the re-adaptation of the spaces, it was almost impossible to replicate the original architectural program or to undertake such a large reconstruction project using only traditional techniques and materials. This essentially made it a new city constructed in the image of the old. The recent needs of a growing city were then met by the construction of a bridge, which has caused it to lose its ‘World Heritage’ status. Finally, this paper endeavours to foster discussion of questions central to the definition of heritage, such as: what happens when we must adapt a living space to prevent its deterioration and descent into dereliction through overuse? Does it necessarily lose its historical value? What exactly is historical value?
Abstract:
Police is Dead is an historiographic analysis whose objective is to change the terms by which contemporary humanist scholarship assesses the phenomenon currently termed neoliberalism. It proceeds by building an archeology of legal thought in the United States that spans the nineteenth and twentieth centuries. My approach assumes that the decline of certain paradigms of political consciousness set historical conditions that enable the emergence of what is to follow. The particular historical form of political consciousness I seek to reintroduce to the present is what I call “police”: a counter-liberal way of understanding social relations that I claim has particular visibility within a legal archive, but that has been largely ignored by humanist theory on account of two tendencies: first, an over-valuation of liberalism as Western history’s master signifier; and second, inconsistent and selective attention to law as a cultural artifact. The first part of my dissertation reconstructs an anatomy of police through close studies of court opinions, legal treatises, and legal scholarship. I focus in particular on juridical descriptions of intimate relationality—which police configured as a public phenomenon—and slave society apologetics, which projected the notion of community as an affective and embodied structure. The second part of this dissertation demonstrates that the dissolution of police was critical to the emergence of a paradigm I call economism: an originally progressive economic framework for understanding social relations that I argue developed at the nexus of law and economics at the turn of the twentieth century. Economism is a way of understanding sociality that collapses ontological distinctions between formally distinct political subjects—i.e., the state, the individual, the collective—by reducing them to the perspective of economic force.
Insofar as it was taken up and reoriented by neoliberal theory, this paradigm has become a hegemonic form of political consciousness. This project concludes by encouraging a disarticulation of economism—insofar as it is a form of knowledge—from neoliberalism as its contemporary doctrinal manifestation. I suggest that this is one way progressive scholarship can think about moving forward in the development of economic knowledge, rather than desiring to move backwards to a time before the rise of neoliberalism. Disciplinarily, I aim to show that understanding the legal historiography informing our present moment is crucial to this task.
Abstract:
This paper presents an economic model of the effects of identity and social norms on consumption patterns. Incorporating qualitative studies from psychology and sociology, I propose a utility function that features two components: an economic (functional) element and an identity element. This setup is extended to analyze a market comprising a continuum of consumers whose identity distribution along a spectrum of binary identities is described by a Beta distribution. I also introduce the notion of salience in the context of identity and consumption decisions. The key result of the model suggests that fundamental economic parameters, such as price elasticity and market demand, can be altered by identity elements. In addition, it predicts that firms in perfectly competitive markets may associate their products with certain types of identities in order to reduce product substitutability and attain price-setting power.
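As a rough illustration of the setup described above, the sketch below assumes an additive utility with a functional term (value minus price) and an identity term that rewards congruence between a consumer's position on a [0, 1] identity spectrum and the identity the product is associated with. The functional form, parameter names, and purchase rule are all assumptions for illustration, not the paper's actual specification:

```python
import random

def demand(price, v=1.0, mu=0.5, salience=1.0, assoc=1.0,
           alpha=2.0, beta=2.0, n=100_000, seed=0):
    """Fraction of a Beta(alpha, beta) population of consumers who buy.

    Utility of buying: (v - price) + mu * salience * (1 - |theta - assoc|),
    where theta is the consumer's identity position on [0, 1] and assoc is
    the identity the product is associated with (all names hypothetical).
    """
    rng = random.Random(seed)
    buys = 0
    for _ in range(n):
        theta = rng.betavariate(alpha, beta)   # consumer's identity draw
        utility = (v - price) + mu * salience * (1.0 - abs(theta - assoc))
        if utility > 0.0:
            buys += 1
    return buys / n

# At a price above the functional value v, purely functional consumers
# (mu = 0) buy nothing, while identity concerns can sustain positive
# demand, illustrating how identity elements alter demand and elasticity.
print(demand(1.2, mu=0.0))   # 0.0
print(demand(1.2, mu=0.5))   # roughly 0.65 with Beta(2, 2) identities
```

The comparison at the end is the point of the sketch: the same price cut or rise moves the two populations very differently, which is the mechanism behind the altered price elasticity the abstract describes.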
Abstract:
Background: I conducted my research in the context of The National Literacy Strategy (DES, 2011), which maintains that every young person should be literate and outlines targets for improving literacy in schools from 2011 to 2020. There has been much debate on the teaching of literacy, and in particular the teaching of reading. Clark (2014) outlines how learning to read should be a developmental language process and how the approaches taken in the early years of schooling will colour children’s motivation and their perception of reading as a purposeful activity. The acquisition of literacy begins in the home, but this study focuses on the implementation of a literacy intervention, Station Teaching, in the infant classes in primary school. Station Teaching occurs when a class is divided into four or five small groups of pupils who receive intensive tuition at four or five different Stations, with the help of support teachers: New Reading, Familiar Reading, Phonics, Writing and Oral Language. Research Questions: These research questions frame my study: How is Station Teaching implemented? What is the experience of the Station Teaching intervention from the participants’ point of view: teachers, pupils, parents? What notion of literacy is Station Teaching facilitating? Methods: I chose a pragmatic parallel mixed-methods design as suggested by Mertens (2010). I collected and analysed both quantitative and qualitative data to answer the study’s research questions. The quantitative data were collected from a questionnaire issued to 21 schools in Ireland. I used Excel as a data management package and thematic analysis to analyse and present the data in themes. I collected qualitative data from a case study in one school. These data included observations of two classes over a period of a year; interviews with teachers, pupils and parents; children’s drawings; photographs; teachers’ diaries; and video evidence.
I analysed and presented the evidence from the qualitative data in themes. Main Findings: Many skills and strategies are essential to effective literacy teaching in the early years, including phonological awareness, phonics, vocabulary, fluency, comprehension and writing; these skills can be taught during Station Teaching (ST). Early intervention is essential to pupils’ acquisition of literacy. The expertise of the teacher is key to improving the literacy achievement of pupils. Teachers and pupils enjoy participating in ST. Pupils are motivated to read and engage in meaningful activities during ST. Staff collaboration is vital for ST to succeed. ST facilitates small-group work, and teachers can differentiate accordingly while including all pupils in the groups. Pupils’ learning is extended in ST, but extension activities need to be addressed in the Writing Station. More training should be provided for teachers on the implementation of ST, and more funding for resources should be made available to schools. Significant contribution of the work: The main significance of the study includes insights into the classroom implementation of Station Teaching in infant classes and extensive research into the characteristics of an effective teacher of literacy.
Abstract:
The increasing nationwide interest in intelligent transportation systems (ITS) and the need for more efficient transportation have led to the expanding use of variable message sign (VMS) technology. VMS panels are substantially heavier than flat-panel aluminum signs and have a larger depth (the dimension parallel to the direction of traffic). The additional weight and depth can have a significant effect on the aerodynamic forces and inertial loads transmitted to the support structure, and the wind-induced drag forces and the response of VMS structures are not well understood. Minimum design requirements for VMS structures are contained in the American Association of State Highway and Transportation Officials Standard Specifications for Structural Supports for Highway Signs, Luminaires, and Traffic Signals (AASHTO Specification). However, the Specification does not take into account the prismatic geometry of VMS or the complex interaction of the applied aerodynamic forces with the support structure. In view of the lack of code guidance and the limited research performed so far, targeted experimentation and large-scale testing were conducted at the Florida International University (FIU) Wall of Wind (WOW) to provide reliable drag coefficients and investigate the aerodynamic instability of VMS. A comprehensive range of VMS geometries was tested in turbulence representative of the high-frequency end of the spectrum in a simulated suburban atmospheric boundary layer. The mean normal, lateral and vertical lift force coefficients, in addition to the twisting moment coefficient and eccentricity ratio, were determined from the measured data for each model. Wind tunnel testing confirmed that drag on a prismatic VMS is smaller than the value of 1.7 suggested in the current AASHTO Specification (2013). An alternative to the AASHTO code value is presented in the form of a design matrix.
Testing and analysis also indicated that vortex-shedding oscillations and galloping instability could be significant for VMS with a large depth ratio attached to a support structure with a low natural frequency. The effect of corner modification was investigated by testing models with chamfered and rounded corners. Results demonstrated a further decrease in the drag coefficient, but a possible Reynolds number dependency for the rounded-corner configuration.
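For orientation, the role a drag coefficient plays in design can be sketched with the standard quasi-steady drag relation. The 1.7 value is the AASHTO figure cited above; the lower coefficient, panel dimensions, and design wind speed below are hypothetical placeholders, not values from the study:

```python
RHO_AIR = 1.225  # kg/m^3, sea-level air density

def drag_force(cd, area_m2, wind_speed_ms, rho=RHO_AIR):
    """Mean along-wind drag force in newtons: F = 0.5 * rho * Cd * A * V^2."""
    return 0.5 * rho * cd * area_m2 * wind_speed_ms ** 2

panel_area = 3.0 * 1.5   # m^2, assumed VMS face dimensions
v_design = 40.0          # m/s, assumed design wind speed

f_code = drag_force(1.7, panel_area, v_design)       # AASHTO Cd = 1.7
f_measured = drag_force(1.45, panel_area, v_design)  # hypothetical lower Cd

# A lower measured Cd translates directly into lower design loads on the
# support structure, which is why the reported reduction matters.
print(f_code, f_measured)
```

Because the force scales linearly with Cd, every percentage point shaved off the coefficient is the same percentage off the load the sign's support structure must be designed to carry.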
Abstract:
John le Carré’s novels “The Spy Who Came in From the Cold” (1963), “Tinker, Tailor, Soldier, Spy” (1974), and “The Tailor of Panama” (1997) focus on how the main characters reflect the somber reality of working in the British intelligence service. Through a broad post-structuralist analysis, I identify the dichotomies - good/evil in “The Spy Who Came in From the Cold,” past/future in “Tinker, Tailor, Soldier, Spy,” and institution/individual in “The Tailor of Panama” - that frame the role of the protagonists. Each character is defined by his ambiguity and swinging moral compass, which transform him into a hybrid creation of morality and adaptability during transitional periods in history, mainly the Cold War. Le Carré’s novels reject the notion of the spy as a celebrated figure standing above the group. Instead, he portrays spies as characters who trade individualism and social belonging for a false sense of heroism, for loneliness, and even for death.
Abstract:
The main objective of Leg 82 of the Glomar Challenger was to document mantle heterogeneity in the vicinity of, and away from, a so-called hot spot: the Azores Triple Junction. One of the geochemical tools that permits, at least in part, the recognition of mantle heterogeneities uses the hygromagmaphile elements, those elements that have an affinity for the liquid phase. This tool is presented in terms of an extended Coryell-Masuda plot, which incorporates among the rare earth elements the hygromagmaphile transition elements Th, Ta, Zr, Hf, Ti, Y, and V. The extended Coryell-Masuda plot is used to summarize our knowledge of mantle heterogeneity along the ridge axis at zero age. It is also used, by choosing those hygromagmaphile elements that can be analyzed on board by X-ray fluorescence spectrometry, to give preliminary information on the enriched or depleted character of recovered samples. Shore-based results, which include analyses of most of the hygromagmaphile elements measured either by X-ray spectrometry or by neutron activation analysis, confirm the shipboard data. From the point of view of comparative geochemistry, the variety of basalts recovered during Leg 82 provides a good opportunity to test and verify the classification of the hygromagmaphile elements. Analyses from Leg 82 provide new data about the relationship between extended rare earth patterns (enriched or depleted), which can be estimated by the La/Sm or the Nb/Zr (or Ta/Hf) ratio, and isotopic signatures: samples from Hole 556 are depleted (low Nb/Zr ratio) but have a high 206Pb/204Pb ratio (19.5); in Hole 558 a moderately enriched basalt unit with a chondrite-normalized La/Sm (= Nb/Zr) ratio of 2 has a high 206Pb/204Pb ratio (20). One of the most interesting results of Leg 82 lies in the crossing patterns of extended Coryell-Masuda plots for basalts from the same hole.
This result enhances the notion of local mantle heterogeneity versus regional mantle heterogeneity and is confirmed by isotope data; it also favors a model of short-lived, discrete magma chambers. The data tend to confirm the Hayes Fracture Zone as a southern limit for the influence of Azores-type mantle. Nevertheless, north of the Hayes Fracture Zone, the influence of a plumelike mantle source is not simple and probably requires an explanation more complex than a contribution from a single fixed hot spot.
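For readers unfamiliar with the tool, the arithmetic behind a Coryell-Masuda (spider) plot is simply chondrite normalization: each measured concentration is divided by its chondritic abundance so that enriched and depleted patterns appear as slopes. The sketch below uses illustrative chondrite values and a made-up sample, not Leg 82 data:

```python
# Approximate CI-chondrite abundances in ppm (illustrative values only).
CHONDRITE_PPM = {"La": 0.237, "Sm": 0.148, "Nb": 0.24, "Zr": 3.87}

def normalize(sample_ppm):
    """Chondrite-normalize a dict of element concentrations (ppm)."""
    return {el: c / CHONDRITE_PPM[el] for el, c in sample_ppm.items()}

# A depleted basalt shows a chondrite-normalized La/Sm ratio below 1,
# an enriched one a ratio above 1 (the sample below is made up).
sample = normalize({"La": 2.0, "Sm": 2.5})
print(sample["La"] / sample["Sm"])  # < 1, i.e. a depleted pattern
```

Plotting the normalized values in element order gives the extended pattern; two samples whose patterns cross, as in the Leg 82 result above, cannot come from a single homogeneous source.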
Abstract:
Uk'37 sea-surface temperature (SST) estimates obtained at ~2.5-k.y. resolution from Ocean Drilling Program Site 1020 show glacial-interglacial cyclicity with an amplitude of 7°-10°C over the last 780 k.y. This record shows a similar pattern of variability to another alkenone-based SST record obtained previously from the Santa Barbara Basin. Both records show that oxygen isotope Stage (OIS) 5.5 was warmer by ~3°C relative to the present and that glacial Uk'37 temperatures warm in advance of deglaciation, as inferred from benthic δ18O records. The alkenone-based SST record at Site 1020 is longer than previously published work along the California margin. We show that warmer than present interglacial stages have occurred frequently during the last 800 k.y. Alkenone concentrations, a proxy for coccolithophorid productivity, indicate that sedimentary marine organic carbon content has also varied significantly over this interval, with higher contents during interglacial periods. A baseline shift to warmer SST and greater alkenone content occurs before OIS 13. We compare our results with those from previous multiproxy studies in this region and conclude that SST has increased by ~5°C since the last glacial period (21 ka). Our data show that maximum alkenone SSTs occur simultaneously with minimum ice volume at Site 1020, which is consistent with data from farther south along the margin. The presence of sea ice in the glacial northeast Pacific, the extent of which is inferred from locations of ice-rafted debris, provides further support for our notion of cold surface water within the northern California Current system, averaging 7°-8°C cooler during peak glacial conditions. The cooling of surface water during glacial stages most likely did not result from enhanced upwelling, because alkenone concentrations and terrestrial redwood pollen assemblages are consistently lower during glacial periods.
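For context, an alkenone SST estimate is obtained by converting the measured Uk'37 unsaturation index through a linear calibration. The sketch below uses the widely cited Müller et al. (1998) global core-top calibration, SST = (Uk'37 - 0.044) / 0.033; this choice of calibration is an assumption here (the study may have used a different fit), and the concentrations are invented:

```python
def uk37_index(c37_2, c37_3):
    """Uk'37 = C37:2 / (C37:2 + C37:3), computed from the concentrations
    of the di- and tri-unsaturated C37 alkenones (any common units)."""
    return c37_2 / (c37_2 + c37_3)

def uk37_to_sst(uk37):
    """SST in deg C via the Mueller et al. (1998) core-top calibration
    (an assumption in this sketch; other calibrations exist)."""
    return (uk37 - 0.044) / 0.033

# Invented concentrations: warmer water -> more of the di-unsaturated form.
index = uk37_index(3.0, 2.0)   # 0.6
print(uk37_to_sst(index))      # about 16.8 deg C
```

With a calibration slope of 0.033 per degree, the 7°-10°C glacial-interglacial amplitude reported above corresponds to index swings of roughly 0.23-0.33, well above analytical noise.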
Abstract:
This paper defines the notion of key inventors — those whose patenting is simultaneously highly productive and widely cited. By implication, key inventors should be the leaders in any developing new field, and we investigate the validity of the notion through an exploration of two emerging technological fields: fuel cells and nanotechnology. The nature of the two groups of key inventors is compared to discuss the differences between the technological fields.
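The definition above suggests a simple operationalization: flag inventors who clear thresholds on both productivity and citation impact. The thresholds and data below are invented for illustration; the paper's actual cut-offs may differ:

```python
def key_inventors(records, min_patents, min_citations):
    """records maps inventor -> (patent_count, total_citations);
    return the set of inventors who clear both thresholds."""
    return {name for name, (patents, cites) in records.items()
            if patents >= min_patents and cites >= min_citations}

records = {
    "inventor_a": (25, 400),  # productive AND widely cited -> key inventor
    "inventor_b": (30, 15),   # productive but rarely cited
    "inventor_c": (2, 500),   # widely cited but not productive
}
print(key_inventors(records, min_patents=10, min_citations=100))
# {'inventor_a'}
```

The conjunction is what makes the notion selective: either criterion alone would admit prolific-but-uncited or one-hit inventors, which is exactly what the definition excludes.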
Abstract:
We prove that a random Hilbert scheme that parametrizes the closed subschemes with a fixed Hilbert polynomial in some projective space is irreducible and nonsingular with probability greater than $0.5$. To consider the set of nonempty Hilbert schemes as a probability space, we transform this set into a disjoint union of infinite binary trees, reinterpreting Macaulay's classification of admissible Hilbert polynomials. Choosing discrete probability distributions with infinite support on the trees establishes our notion of random Hilbert schemes. To bound the probability that random Hilbert schemes are irreducible and nonsingular, we show that at least half of the vertices in the binary trees correspond to Hilbert schemes with unique Borel-fixed points.
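The headline bound can be sketched as follows; this is a reconstruction from the statements above, with notation introduced here, not the paper's own argument:

```latex
% Notation assumed for this sketch: V is the vertex set of the binary
% trees (one vertex per nonempty Hilbert scheme), G \subseteq V the
% vertices whose Hilbert scheme has a unique Borel-fixed point (hence is
% irreducible and nonsingular), and p a discrete distribution on V.
\[
  \Pr[\text{irreducible and nonsingular}]
    \;\ge\; \sum_{v \in G} p(v) \;>\; \tfrac{1}{2},
\]
% where the last inequality rests on the counting result that G contains
% at least half of the vertices, combined with a choice of p that
% transfers this counting bound to probability mass.
```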
Abstract:
There is a place where a Canadian citizen can be sentenced to 30 days’ detention, by someone who is not a judge, without being represented by counsel, and without having a meaningful right to appeal. It is the summary trial system of the Canadian Armed Forces. This thesis analyses that system and suggests reforms. It is aimed at those who have an interest in improving the administration of military justice at the unit level but wish first to understand the issues sufficiently. Through a classic legal approach with elements of legal history and comparative law, this study begins by setting military justice in the Canadian legal firmament. The introductory chapter also explains fundamental concepts, first and foremost the broader notion of discipline, for which the summary trial is one of the last tools for maintaining it. Chapter II describes the current system. An overview of its historical background is first given; then each procedural step is demystified, from investigation to review. Chapter III identifies potential breaches of the Charter, highlighting those that put the system at greatest constitutional risk: the lack of judicial independence, the absence of hearing transcripts, the lack of legal representation and the disparity of treatment between ranks. Alternatives adopted in the Canadian Armed Forces and in foreign jurisdictions, from both common law and civil law traditions, to address similar challenges are reviewed in Chapter IV. Chapter V analyses whether the breaches could nevertheless be justified in a free and democratic society. Its conclusion is that, considering the availability of reasonable alternatives, it would be hard to convince a court that the current system is a legitimate impairment of the individual’s legal rights. The concluding chapter presents options to address the current challenges. First, the ‘depenalization’ approach taken by the Government in the recent Bill C-71 is analysed and criticised.
The ‘judicialization’ approach is advocated through a series of 16 recommendations designed not only to strengthen the constitutionality of the system but also to improve the administration of military justice in furtherance of service members’ legal rights.
Abstract:
Smart cities, cities that are supported by an extensive digital infrastructure of sensors, databases and intelligent applications, have become a major area of academic, governmental and public interest. Simultaneously, there has been growing interest in open data: the unrestricted use of organizational data for public viewing and use. Drawing on Science and Technology Studies (STS), Urban Studies and Political Economy, this thesis examines how digital processes, open data and the physical world can be combined in smart city development, through a qualitative, interview-based case study of a Southern Ontario municipality, Anytown. The thesis asks: what are the challenges associated with smart city development and open data proliferation; is open data complementary to smart urban development; and how is expertise constructed in these fields? The thesis concludes that smart city development in Anytown is a complex process involving a variety of visions, programs and components. Although smart city and open data initiatives exist in Anytown, and some are even overlapping and complementary, smart city development there is in its infancy. However, expert informants remained optimistic, faithful to a technologically sublime vision of what a smart city would bring. The thesis also questions the notion of expertise within the context of smart city and open data projects, concluding that assertions of expertise need to be treated with caution and scepticism when considering how knowledge is received, generated, interpreted and circulated within organizations.
Abstract:
This project is about Fast and Female, a community-based girls’ sport organization that focuses on empowering girls through sport. In this thesis I produce a discourse analysis from interviews with six expert sportswomen and a textual analysis of the organization’s online content, including its social media pages. I ground my analysis in poststructural theory as explained by Chris Weedon (1997) and in literature that helps contextualize and better define empowerment (Collins, 2000; Cruikshank, 1999; Hains, 2012; Sharma, 2008; Simon, 1994) and neoliberalism (Silk & Andrews, 2012). My analysis suggests that Fast and Female develops a community through online and in-person interaction. This community is focused on girls’ sport and empowerment, but, because the organization is situated in a neoliberal context, organizers must take extra care if the organization is to develop a girls’ sport culture that is truly representative of the desires and needs of the participants rather than of implicit neoliberal values. It is important to note that Fast and Female does not identify as a feminist organization. Through this thesis I argue that Fast and Female teaches girls that sport is empowering; but while the organization draws on “empowerment,” a term often used by feminists, it promotes a notion of empowerment that teaches female athletes how to exist within current mainstream and sporting cultures, rather than encouraging them to become empowered female citizens who learn to question and challenge social inequity. I conclude my thesis with suggestions for how Fast and Female might encourage empowerment in spite of the current neoliberal situation. I also offer a goal-setting workbook that I developed to encourage girls to set goals while thinking about their communities rather than just themselves.
Abstract:
The paper investigates how Information Systems (IS) has emerged as the product of inter-disciplinary discourses. The research aim of this study is to better understand diversity in IS research, and the extent to which the diversity of its discourse expanded and contracted from 1995 to 2011. Methodologically, we apply a combined citation/co-citation analysis based on the eight Association for Information Systems basket journals and the 22-subject-field classification framework provided by the Association of Business Schools. Our findings suggest that IS is in a state of continuous interaction and competition with other disciplines. General Management declined from a dominant position as a reference discipline in IS, giving way to a growing variety of other discourses, including Business Strategy, Marketing, and Ethics and Governance. Over time, IS as a field moved from the periphery to a central position in its own discursive formation. This supports the notion of IS as a fluid discipline dynamically embracing a diverse range of adjacent reference disciplines while keeping a degree of continuing interaction with them. Understanding where IS currently stands allows us to better understand and propose fruitful avenues for its development in both academia and practice. © 2013 JIT Palgrave Macmillan All rights reserved.
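The co-citation half of such an analysis rests on a simple count: two references are co-cited whenever they appear together in a single article's reference list, and the accumulated pair counts measure how closely two bodies of work (or disciplines) sit together. A minimal sketch with invented reference lists, not the study's corpus:

```python
from collections import Counter
from itertools import combinations

def cocitation_counts(reference_lists):
    """Count, for every pair of references, how many articles cite both."""
    counts = Counter()
    for refs in reference_lists:
        # sorted() gives each pair a canonical key regardless of list order
        for pair in combinations(sorted(set(refs)), 2):
            counts[pair] += 1
    return counts

articles = [                       # invented reference lists
    ["IS_paper", "Mgmt_paper", "Marketing_paper"],
    ["IS_paper", "Mgmt_paper"],
    ["IS_paper", "Ethics_paper"],
]
counts = cocitation_counts(articles)
print(counts[("IS_paper", "Mgmt_paper")])  # 2: co-cited in two articles
```

Mapping each reference to its subject field and aggregating these pair counts over time is, in essence, how the expansion and contraction of IS's discursive neighbours can be traced.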
Abstract:
This dissertation offers a critical international political economy (IPE) analysis of the ways in which consumer information has been governed throughout the formal history of consumer finance (1840 to the present). Drawing primarily on the United States, this project problematizes the notion of consumer financial big data as a ‘new era’ by tracing its roots from the late nineteenth century through to the present. Using a qualitative case study approach, it applies a unique theoretical framework to three instances of governance in consumer credit big data. Throughout, the historically specific means used to govern consumer credit data are shown to be rooted in dominant ideas, institutions and material factors.