Abstract:
In our rejoinder to Don Weatherburn's paper, “Law and Order Blues”, we do not take issue with his advocacy of the need to take crime seriously and to foster a more rational approach to the problems it poses. Differences do emerge, however, (1) over his claim that he is willing to do so whilst we (in our different ways) are not; and (2) over the question of what this involves. Of particular concern is the way in which his argument proceeds by a combination of simple misrepresentation of the positions it seeks to disparage and silence on issues of real substance where intellectual debate and exchange would be welcome and useful. Our paper challenges, in turn, the misrepresentation of Indermaur's analysis of trends in violent crime, the misrepresentation of Hogg and Brown's Rethinking Law and Order, the misrepresentation of the findings of some of the research into the effectiveness of punitive policies, and the silence on sexual assault in “Law and Order Blues”. We suggest that his silence on sexual assault reflects a more widespread unwillingness to acknowledge the methodological problems that arise in the measurement of crime, because such problems severely limit the extent to which confident assertions can be made about prevalence and trends.
Abstract:
Building on and bringing up to date the material presented in the first installment of Directory of World Cinema: Australia and New Zealand, this volume continues the exploration of the cinema produced in Australia and New Zealand since the beginning of the twentieth century. Among the additions to this volume are in-depth treatments of the locations that feature prominently in the countries' cinema. Essays by leading critics and film scholars consider the significance in films of the outback and the beach, which is evoked as a liminal space in Long Weekend and a symbol of death in Heaven's Burning, among other films. Other contributions turn the spotlight on previously unexplored genres and key filmmakers, including Jane Campion, Rolf de Heer, Charles Chauvel, and Gillian Armstrong.
Abstract:
This chapter begins with a discussion of the economic, political, and social context of the recent global financial crisis, which casts into relief the current boundaries of criminology, permeated and made fluid in criminology's recent cultural turn. This cultural turn has reinvigorated criminology, providing new objects of analysis and rich and thick descriptions of the relationship between criminal justice and the conditions of life in ‘late modernity’. Yet in comparison with certain older traditions that sought to articulate criminal justice issues with a wider politics of contestation around the political economies and social welfare policies of different polities, many of the current leading culturalist accounts tend in their globalized convergences to produce a strangely decontextualized picture in which we are all subject to the zeitgeist of a unitary ‘late modernity’ which does not differ between, for example, social democratic and neo-liberal polities, let alone allow for the widespread persistence of the pre-modern. It is argued that, contrary to this globalizing trend, there are signs within criminology that life is being breathed back into social democratic and penal welfare concerns, habitus, and practices. The chapter discusses three of these signs: the emergence of neo-liberalism as a subject of criminology; a developing comparative penology which recognizes differences in the political economies of capitalist states and evinces a renewed interest in inequality; and a nascent revolt against the ‘generative grammar’, ‘pathological disciplinarities’, and ‘imaginary penalities’ of neoliberal managerialism.
Abstract:
It has become increasingly demanding to investigate the impacts of wind farms on power system operation, as ever-increasing penetration levels of wind power have the potential to bring about a series of dynamic stability problems for power systems. This paper undertakes such an investigation by examining the small signal and transient stabilities of power systems that are separately integrated with three types of wind turbine generators (WTGs), namely the squirrel cage induction generator (SCIG), the doubly fed induction generator (DFIG), and the permanent magnet generator (PMG). To examine the effects of these WTGs on a power system's stability under different operating conditions, a selected synchronous generator (SG) of the well-known Western Electricity Coordinating Council (WECC) three-unit nine-bus system and of an eight-unit 24-bus system is replaced in turn by each type of WTG with the same capacity. The performances of the power system in response to disturbances are then systematically compared. Specifically, the following comparisons are undertaken: (1) performances of the power system before and after the integration of the WTGs; and (2) performances of the power system and the associated consequences when the SCIG, DFIG, or PMG is separately connected to the system. These stability case studies utilize both eigenvalue analysis and dynamic time-domain simulation methods.
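The eigenvalue side of such a study can be sketched in a few lines. The state matrix below is an illustrative placeholder for one linearised oscillatory mode, not a model of the WECC nine-bus or 24-bus systems discussed in the paper.

```python
import numpy as np

# Linearised system dx/dt = A x for one illustrative swing mode.
# Entries are made-up placeholders; a real study would linearise the
# full system model, WTGs included, around each operating point.
A = np.array([[0.0, 376.99],    # d(delta)/dt = omega_s * delta_omega
              [-0.1, -0.05]])   # synchronising and damping terms

eigvals = np.linalg.eigvals(A)

# Small-signal stability: every eigenvalue must have a negative real part.
stable = bool(np.all(eigvals.real < 0))

# Damping ratio of each mode: zeta = -Re(lambda) / |lambda|.
zeta = -eigvals.real / np.abs(eigvals)
print(stable, zeta)
```

Transient stability, by contrast, is assessed with time-domain simulation of large disturbances, which is why the paper uses both methods.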
Abstract:
This study attempts to develop a better understanding of the challenges of knowledge integration (KI) within the innovation process in Small and Medium Enterprises (SMEs). Using several case studies, this study investigates how knowledge integration may be managed within the context of innovation in SMEs. The research places particular focus on identifying the challenges of knowledge integration in SMEs in relation to three knowledge integration activities, namely knowledge identification, knowledge acquisition, and knowledge sharing. Four distinct tasks emerged in the knowledge integration process, namely team building capability, capturing tacit knowledge, the role of knowledge management (KM) systems, and technological systemic integration. The paper suggests that knowledge integration in SMEs can best be managed by focusing on these four tasks, which in turn will lead to innovation.
Abstract:
2012 saw the publication of competing and complementary lines of Australian “classics”: “A&R Australian Classics” (HarperCollins) and “Text Classics” (Text Publishing). While Angus and Robertson were key in establishing a canon of Australian children’s classics in the twentieth century, it was the Text Classics line which included a selection of young people’s titles in 2013. In turn, Penguin Australia launched a selection of “Australian Children’s Classics”. In so doing, these publishers were drawing on particular literary and visual cultural traditions in Australian children’s literature. Public assertions of a particular selection of children’s books reveal not only contemporary assumptions about desirable childhood experiences but also about the operation of nostalgia therein. In encouraging Australian adults to judge books by their covers, such gestures imply that Australian children may be similarly understood. Importantly, the illusion of unity, sameness, and legibility which is promised by circumscribed canons of “classic” children’s literature may well imply a desire for a similarly illusory, unified, legible, “classic” childhood. This paper attends to public attempts to materialise (and legitimise) a canon of classic Australian children’s literature. In particular, it considers the ways in which publishing, postage stamps, and book awards make visible a range of children’s books, but do so in order to either fix or efface the content or meaning of the books themselves. Moving between assertions of the best books for children from the 1980s to today, and of the social values circulated within those books, this paper considers the possibilities and problematics of an Australian children’s canon.
Abstract:
In the first Modern Language Association newsletter for 2006, renowned poetry critic and MLA President, Marjorie Perloff, remarked on the growing ascendancy of Creative Writing within English Studies in North America. In her column, Perloff notes that "[i]n studying the English Job Information List (JIL) so as to advise my own students and others I know currently on the market, I noticed what struck me as a curious trend: there are, in 2005, almost three times as many positions in creative writing as in the study of twentieth-century literature" (3). The dominance of Creative Writing in the English Studies job list in turn reflects the growing student demand for undergraduate and postgraduate degrees in the field—over the past 20 years, BA and MA degrees in Creative Writing in North American tertiary institutions have quadrupled (3)...
Abstract:
In this thesis we investigate the use of quantum probability theory for ranking documents. Quantum probability theory is used to estimate the probability of relevance of a document given a user's query. We posit that quantum probability theory can lead to a better estimation of the probability of a document being relevant to a user's query than the common approach, i.e. the Probability Ranking Principle (PRP), which is based upon Kolmogorovian probability theory. Following our hypothesis, we formulate an analogy between the document retrieval scenario and a physical scenario, that of the double slit experiment. Through the analogy, we propose a novel ranking approach, the quantum probability ranking principle (qPRP). Key to our proposal is the presence of quantum interference. Mathematically, this is the statistical deviation between empirical observations and the expected values predicted by the Kolmogorovian rule of additivity of probabilities of disjoint events in configurations such as that of the double slit experiment. We propose an interpretation of quantum interference in the document ranking scenario, and examine how quantum interference can be effectively estimated for document retrieval. To validate our proposal and to gain more insights about approaches for document ranking, we (1) analyse PRP, qPRP and other ranking approaches, exposing the assumptions underlying their ranking criteria and formulating the conditions for the optimality of the two ranking principles, (2) empirically compare three ranking principles (i.e. PRP, interactive PRP, and qPRP) and two state-of-the-art ranking strategies in two retrieval scenarios, those of ad-hoc retrieval and diversity retrieval, (3) analytically contrast the ranking criteria of the examined approaches, exposing similarities and differences, (4) study the ranking behaviours of approaches alternative to PRP in terms of the kinematics they impose on relevant documents, i.e.
by considering the extent and direction of the movements of relevant documents across the ranking recorded when comparing PRP against its alternatives. Our findings show that the effectiveness of the examined ranking approaches strongly depends upon the evaluation context. In the traditional evaluation context of ad-hoc retrieval, PRP is empirically shown to be better than or comparable to alternative ranking approaches. However, when we turn to evaluation contexts that account for interdependent document relevance (i.e. when the relevance of a document is assessed also with respect to other retrieved documents, as is the case in the diversity retrieval scenario), the use of quantum probability theory, and thus of qPRP, is shown to improve retrieval and ranking effectiveness over the traditional PRP and alternative ranking strategies, such as Maximal Marginal Relevance, Portfolio theory, and Interactive PRP. This work represents a significant step forward regarding the use of quantum theory in information retrieval. It demonstrates, in fact, that the application of quantum theory to problems within information retrieval can lead to improvements both in modelling power and retrieval effectiveness, allowing the construction of models that capture the complexity of information retrieval situations. Furthermore, the thesis opens up a number of lines for future research. These include: (1) investigating estimations and approximations of quantum interference in qPRP; (2) exploiting complex numbers for the representation of documents and queries; and (3) applying the concepts underlying qPRP to tasks other than document ranking.
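The statistical deviation described above takes, in the standard double-slit treatment, the following form for two mutually exclusive events A and B (this is the textbook interference expression, not the thesis's specific estimator):

\[ p(A \cup B) = p(A) + p(B) + 2\sqrt{p(A)\,p(B)}\,\cos\theta_{AB} \]

When \(\cos\theta_{AB} = 0\), Kolmogorovian additivity is recovered; a non-zero phase term is what qPRP exploits to model interdependent document relevance.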
Abstract:
The purpose of this paper is to describe a new decomposition construction for perfect secret sharing schemes with graph access structures. The previous decomposition construction proposed by Stinson is a recursive method that uses small secret sharing schemes as building blocks in the construction of larger schemes. When the Stinson method is applied to graph access structures, the number of such “small” schemes is typically exponential in the number of participants, resulting in an exponential algorithm. Our method has the same flavor as the Stinson decomposition construction; however, the linear programming problem involved in the construction is formulated in such a way that the number of “small” schemes is polynomial in the number of participants, which in turn gives rise to a polynomial time construction. We also show that if we apply the Stinson construction to the “small” schemes arising from our new construction, both have the same information rate.
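As a toy illustration of the kind of “small” scheme such decompositions stitch together, here is a minimal additive 2-out-of-2 sharing over the integers mod a prime; this is a generic sketch for intuition, not the paper's construction for graph access structures.

```python
import secrets

P = 2**61 - 1  # prime modulus; the choice is illustrative

def share(secret):
    """Additive 2-out-of-2 secret sharing over Z_P.

    Either share alone is uniformly random, so it reveals nothing;
    both together reconstruct the secret exactly.
    """
    r = secrets.randbelow(P)
    return r, (secret - r) % P

def reconstruct(s1, s2):
    return (s1 + s2) % P

s1, s2 = share(123456789)
assert reconstruct(s1, s2) == 123456789
```

Decomposition constructions cover a graph access structure edge by edge with schemes of roughly this kind; the contribution summarised above is a linear-programming formulation that keeps the number of such pieces polynomial.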
Abstract:
Heparan sulfate proteoglycans (HSPGs) are complex and labile macromolecular moieties on the surfaces of cells that control the activities of a range of extracellular proteins, particularly those driving growth and regeneration. Here, we examine the biosynthesis of heparan sulfate (HS) sugars produced by cultured MC3T3-E1 mouse calvarial pre-osteoblast cells in order to explore the idea that changes in HS activity in turn drive phenotypic development during osteogenesis. Cells grown for 5 days under proliferating conditions were compared to cells grown for 20 days under mineralizing conditions with respect to their phenotype, the forms of HS core protein produced, and their HS sulfotransferase biosynthetic enzyme levels. RQ-PCR data was supported by the results from the purification of day 5 and day 20 HS forms by anionic exchange chromatography. The data show that cells in active growth phases produce more complex forms of sugar than cells that have become relatively quiescent during active mineralization, and that these in turn can differentially influence rates of cell growth when added exogenously back to preosteoblasts.
Abstract:
We present a novel approach for multi-object detection in aerial videos based on tracking. The proposed method mainly involves three steps. Firstly, the spatial-temporal saliency is employed to detect moving objects. Secondly, the detected objects are tracked by mean shift in the subsequent frames. Finally, the saliency results are fused with the weight map generated by tracking to get refined detection results, and in turn the modified detection results are used to update the tracking models. The proposed algorithm is evaluated on VIVID aerial videos, and the results show that our approach can reliably detect moving objects even in challenging situations. Meanwhile, the proposed method can process videos in real time, without the effect of time delay.
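A minimal sketch of the fusion step, assuming a convex combination of the two maps with an arbitrary weight and threshold (the abstract does not specify the exact fusion rule, so `alpha` and `thresh` are assumptions):

```python
import numpy as np

def fuse(saliency, track_weight, alpha=0.6, thresh=0.5):
    """Refine detections by fusing a saliency map with a tracking
    weight map, then thresholding into a binary detection mask."""
    fused = alpha * saliency + (1.0 - alpha) * track_weight
    return fused > thresh

# Toy 2x2 maps standing in for per-pixel scores.
saliency = np.array([[0.9, 0.2],
                     [0.1, 0.8]])
tracking = np.array([[0.8, 0.1],
                     [0.3, 0.9]])
mask = fuse(saliency, tracking)
print(mask)
```

In the pipeline described above, the resulting mask would feed back to update the mean-shift tracking models on the next frame.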
Abstract:
Measuring Earth material behaviour on time scales of millions of years transcends our current capability in the laboratory. We review an alternative path considering multiscale and multiphysics approaches with quantitative structure-property relationships. This approach allows a sound basis to incorporate physical principles such as chemistry, thermodynamics, diffusion and geometry-energy relations into simulations and data assimilation on the vast range of length and time scales encountered in the Earth. We identify key length scales for Earth systems processes and find a substantial scale separation between chemical, hydrous and thermal diffusion. We propose that this allows a simplified two-scale analysis where the outputs from the micro-scale model can be used as inputs for meso-scale simulations, which in turn become the micro-model for the next scale up. We present two fundamental theoretical approaches to link the scales through asymptotic homogenisation from a macroscopic thermodynamic view and percolation renormalisation from a microscopic, statistical mechanics view.
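The scale separation claimed above can be made concrete with the standard diffusion length (an order-of-magnitude argument, not a result specific to this review):

\[ \ell_i \sim \sqrt{\kappa_i\, t} \]

Since thermal, hydrous and chemical diffusivities \(\kappa_i\) in rocks typically differ by several orders of magnitude, the corresponding lengths \(\ell_i\) separate cleanly at any given time \(t\), which is what licenses passing each scale's output upward as the next scale's input.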
Abstract:
Geoscientists are confronted with the challenge of assessing nonlinear phenomena that result from multiphysics coupling across multiple scales, from the quantum level to the scale of the Earth and from femtoseconds to the 4.5 Ga history of our planet. We neglect in this review electromagnetic modelling of the processes in the Earth’s core, and focus on four types of couplings that underpin fundamental instabilities in the Earth. These are thermal (T), hydraulic (H), mechanical (M) and chemical (C) processes, which are driven and controlled by the transfer of heat to the Earth’s surface. Instabilities appear as faults, folds, compaction bands, shear/fault zones, plate boundaries and convective patterns. Convective patterns emerge from buoyancy overcoming viscous drag at a critical Rayleigh number. All other processes emerge from non-conservative thermodynamic forces with a critical dissipative source term, which can be characterised by the modified Gruntfest number Gr. These dissipative processes reach a quasi-steady state when, at maximum dissipation, THMC diffusion (Fourier, Darcy, Biot, Fick) balances the source term. The emerging steady-state dissipative patterns are defined by the respective diffusion length scales. These length scales provide a fundamental thermodynamic yardstick for measuring instabilities in the Earth. The implementation of a fully coupled THMC multiscale theoretical framework into an applied workflow is still in its early stages. This is largely owing to the four fundamentally different lengths of the THMC diffusion yardsticks, spanning micrometres to tens of kilometres, compounded by the additional necessity to consider microstructure information in the formulation of enriched continua for THMC feedback simulations (i.e., a microstructure-enriched continuum formulation).
Another challenge is to consider the important factor of time, which implies that the geomaterial is often very far from initial yield and flowing on a time scale that cannot be accessed in the laboratory. This leads to the requirement of adopting a thermodynamic framework in conjunction with flow theories of plasticity. This framework allows, unlike consistency plasticity, the description of both solid mechanical and fluid dynamic instabilities. In the applications we show the similarity of THMC feedback patterns across scales, such as brittle and ductile folds and faults. A particularly interesting case is discussed in detail, where, out of the fluid dynamic solution, ductile compaction bands appear which are akin to, and can be confused with, their brittle siblings. The main difference is that they require the factor of time and also much lower driving forces to emerge. These low-stress solutions cannot be obtained on short laboratory time scales, and they are therefore much more likely to appear in nature than in the laboratory. We finish with a multiscale description of a seminal structure in the Swiss Alps, the Glarus thrust, which puzzled geologists for more than 100 years. Along the Glarus thrust, a km-scale package of rocks (nappe) has been pushed 40 km over its footwall as a solid rock body. The thrust itself is a m-wide ductile shear zone, while in turn the centre of the thrust shows a mm-cm wide central slip zone experiencing periodic extreme deformation akin to a stick-slip event. The m-wide creeping zone is consistent with the THM feedback length scale of solid mechanics, while the ultralocalised central slip zone is most likely a fluid dynamic instability.
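For reference, the critical-Rayleigh-number criterion for convective onset mentioned above uses the standard definition for a fluid layer of thickness L heated from below (textbook form with conventional symbols, not notation taken from this review):

\[ Ra = \frac{\rho\, g\, \alpha\, \Delta T\, L^{3}}{\kappa\, \mu} \]

with density ρ, gravity g, thermal expansivity α, temperature contrast ΔT, thermal diffusivity κ and dynamic viscosity μ; convection sets in once Ra exceeds a critical value of order 10³.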
Abstract:
Enterprise resource planning (ERP) systems are rapidly being combined with “big data” analytics processes and publicly available “open data sets”, which are usually outside the arena of the enterprise, to expand activity through better service to current clients as well as by identifying new opportunities. Moreover, these activities are now largely based around relevant software systems hosted in a “cloud computing” environment. The over 50-year-old phrase reflecting mistrust in computer systems, namely “garbage in, garbage out” or “GIGO”, is used to describe problems of unqualified and unquestioning dependency on information systems. A more relevant GIGO interpretation, however, arose sometime later, namely “garbage in, gospel out”, signifying that with large-scale information systems based around ERP and open datasets as well as “big data” analytics, particularly in a cloud environment, the ability to verify the authenticity and integrity of the data sets used may be almost impossible. In turn, this may easily result in decision making based upon questionable results which are unverifiable. Illicit “impersonation” of, and modifications to, legitimate data sets may become a reality, while at the same time the ability to audit any derived results of analysis may be an important requirement, particularly in the public sector. The pressing need for enhancement of identity, reliability, authenticity and audit services, including naming and addressing services, in this emerging environment is discussed in this paper. Some appropriate technologies currently being offered are also examined. However, severe limitations in addressing the problems identified are found, and the paper proposes further necessary research work for the area.
(Note: This paper is based on an earlier unpublished paper/presentation “Identity, Addressing, Authenticity and Audit Requirements for Trust in ERP, Analytics and Big/Open Data in a ‘Cloud’ Computing Environment: A Review and Proposal” presented to the Department of Accounting and IT, College of Management, National Chung Chen University, 20 November 2013.)
Abstract:
"In this chapter the authors present a critique of Participatory Evaluation as worked in development projects, in this case, in Nepal. The article works between established claims that Participatory Evaluation builds capacity at programmatic and organisational levels, and the specific experiences of these claims in the authors’ current work. They highlight the need to address key difficulties such as high turn-over of staff and resulting loss of capacity to engage in Participatory Evaluation, and the difficulty of communication between academic as compared with local practical wisdoms. A key issue is the challenge of addressing the inevitable issues of power inequities that such approaches encounter. While Participatory Evaluation has been around for some time, it has only enjoyed more widespread recognition of its value in comparatively recent times, with its uptake in international development environments. To this extent, the practice is still in its early stages of development, and Jo, June and Michael’s work contributes to strengthening and more comprehensively understanding it. With regard to the meta-theme of this publication, this chapter is an example of how context not only influences the methodology to be used and the praxis of how it is to be used, but contributes to early explication of the core nature of an emerging methodology."