573 results for critical state
Abstract:
PySSM is a Python package developed for the analysis of time series using linear Gaussian state space models (SSMs). PySSM is easy to use; models can be set up quickly and efficiently, and a variety of different settings are available to the user. It takes advantage of the scientific libraries NumPy and SciPy and other high-level features of the Python language. PySSM also serves as a platform for interfacing with optimised and parallelised Fortran routines. These Fortran routines heavily utilise Basic Linear Algebra Subprograms (BLAS) and Linear Algebra Package (LAPACK) functions for maximum performance. PySSM contains classes for filtering, classical smoothing and simulation smoothing.
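As an illustration of the filtering recursions such a package implements, the following is a minimal NumPy sketch of the Kalman filter for a linear Gaussian state space model. It is not the PySSM interface, and the local level model and its parameter values are purely illustrative.

```python
import numpy as np

# Minimal Kalman filter for a linear Gaussian state space model:
#   x_t = F x_{t-1} + w_t,  w_t ~ N(0, Q)
#   y_t = H x_t + v_t,      v_t ~ N(0, R)
# Generic NumPy sketch, not the PySSM interface.

def kalman_filter(y, F, H, Q, R, x0, P0):
    """Return filtered state means and covariances for observations y."""
    x, P = x0, P0
    means, covs = [], []
    for y_t in y:
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Update
        S = H @ P @ H.T + R                  # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
        x = x + K @ (y_t - H @ x)
        P = (np.eye(len(x)) - K @ H) @ P
        means.append(x)
        covs.append(P)
    return np.array(means), np.array(covs)

# Example: local level model (random walk plus noise), illustrative values only.
rng = np.random.default_rng(0)
true_level = np.cumsum(rng.normal(0, 0.1, 100))
obs = (true_level + rng.normal(0, 0.5, 100)).reshape(-1, 1)
means, covs = kalman_filter(obs,
                            F=np.eye(1), H=np.eye(1),
                            Q=np.eye(1) * 0.01, R=np.eye(1) * 0.25,
                            x0=np.zeros(1), P0=np.eye(1))
```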
Abstract:
One of the promises of New Labour was that government policy would be grounded in 'evidence based research'. In recent years some academics have come to question whether the government has delivered on this promise. Professors Reece Walters and Tim Hope offer two contributions to this debate, arguing that it is political considerations, rather than the 'evidence base', that govern the commissioning, production and dissemination of Home Office research. As the first monograph in our 'Evidence based policy series', Critical thinking about the uses of research carries a thought-provoking set of arguments.
Abstract:
This work offers a critical introduction to sociology for New Zealand students. Written in an accessible narrative style, it seeks to challenge and debunk students' assumptions about key elements of their social worlds, encouraging them to develop a "critical imagination" as a tool to identify broader social themes in personal issues.
Abstract:
This article presents a critical analysis of the current and proposed carbon capture and storage (CCS) legal frameworks across a number of jurisdictions in Australia in order to examine the legal treatment of the risks of carbon leakage from CCS operations. It does so through an analysis of the statutory obligations and liability rules established under the offshore Commonwealth and Victorian regimes, and the onshore Queensland and Victorian legislative frameworks. Exposure draft legislation for CCS laws in Western Australia is also examined. In considering where the losses will fall in the event of leakage, the potential tortious and statutory liabilities of private operators and the State are addressed alongside the operation of statutory protections from liability. The current legal treatment of CCS under the new Australian Carbon Pricing Mechanism is also critiqued.
Abstract:
Existing secure software development principles tend to focus on coding vulnerabilities, such as buffer or integer overflows, that apply to individual program statements, or issues associated with the run-time environment, such as component isolation. Here we instead consider software security from the perspective of potential information flow through a program’s object-oriented module structure. In particular, we define a set of quantifiable "security metrics" which allow programmers to quickly and easily assess the overall security of a given source code program or object-oriented design. Although measuring quality attributes of object-oriented programs for properties such as maintainability and performance has been well-covered in the literature, metrics which measure the quality of information security have received little attention. Moreover, existing security-relevant metrics assess a system either at a very high level, i.e., the whole system, or at a fine level of granularity, i.e., with respect to individual statements. These approaches make it hard and expensive to recognise a secure system from an early stage of development. Instead, our security metrics are based on well-established compositional properties of object-oriented programs (i.e., data encapsulation, cohesion, coupling, composition, extensibility, inheritance and design size), combined with data flow analysis principles that trace potential information flow between high- and low-security system variables. We first define a set of metrics to assess the security quality of a given object-oriented system based on its design artifacts, allowing defects to be detected at an early stage of development. We then extend these metrics to produce a second set applicable to object-oriented program source code. The resulting metrics make it easy to compare the relative security of functionally equivalent system designs or source code programs so that, for instance, the security of two different revisions of the same system can be compared directly. This capability is further used to study the impact of specific refactoring rules on system security more generally, at both the design and code levels. By measuring the relative security of various programs refactored using different rules, we thus provide guidelines for the safe application of refactoring steps to security-critical programs. Finally, to make it easy and efficient to measure a system design or program’s security, we have also developed a stand-alone software tool which automatically analyses and measures the security of UML designs and Java program code. The tool’s capabilities are demonstrated by applying it to a number of security-critical system designs and Java programs. Notably, the validity of the metrics is demonstrated empirically through measurements that confirm our expectation that program security typically improves as bugs are fixed, but worsens as new functionality is added.
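To illustrate the flavour of a design-level metric, the sketch below computes a toy "classified attribute exposure" measure over a hypothetical class model. The class representation, the attribute flags and the metric itself are illustrative assumptions, not the thesis's actual metric definitions.

```python
# Toy design-level security metric: the fraction of "classified" (high-security)
# attributes in a class model that are exposed through non-private accessibility.
# The class model, attribute names and metric are hypothetical illustrations.
from dataclasses import dataclass, field

@dataclass
class Attribute:
    name: str
    classified: bool      # carries high-security data?
    private: bool         # encapsulated behind the class interface?

@dataclass
class ClassDesign:
    name: str
    attributes: list = field(default_factory=list)

def classified_attribute_exposure(design: list) -> float:
    """Share of classified attributes that are not private (lower is better)."""
    classified = [a for c in design for a in c.attributes if a.classified]
    if not classified:
        return 0.0
    exposed = [a for a in classified if not a.private]
    return len(exposed) / len(classified)

design = [
    ClassDesign("Account", [Attribute("balance", True, True),
                            Attribute("pin", True, False),
                            Attribute("nickname", False, False)]),
]
print(classified_attribute_exposure(design))  # 0.5 in this toy example
```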
Abstract:
In recent times, light gauge steel framed (LSF) structures, such as cold-formed steel wall systems, are increasingly used, but without a full understanding of their fire performance. Traditionally the fire resistance rating of these load-bearing LSF wall systems is based on approximate prescriptive methods developed based on limited fire tests. Very often they are limited to standard wall configurations used by the industry. Increased fire rating is provided simply by adding more plasterboards to these walls. This is not an acceptable situation as it not only inhibits innovation and structural and cost efficiencies but also casts doubt over the fire safety of these wall systems. Hence a detailed fire research study into the performance of LSF wall systems was undertaken using full-scale fire tests and extensive numerical studies. A new composite wall panel developed at QUT was also considered in this study, where the insulation was used externally between the plasterboards on both sides of the steel wall frame instead of locating it in the cavity. Three full-scale fire tests of LSF wall systems built using the new composite panel system were undertaken at a higher load ratio using a gas furnace designed to deliver heat in accordance with the standard time-temperature curve in AS 1530.4 (SA, 2005). Fire tests included the measurements of load-deformation characteristics of LSF walls until failure as well as associated time-temperature measurements across the thickness and along the length of all the specimens. Tests of LSF walls under axial compression load have shown the improvement to their fire performance and fire resistance rating when the new composite panel was used. Hence this research recommends the use of the new composite panel system for cold-formed LSF walls. The numerical study was undertaken using the finite element program ABAQUS. The finite element analyses were conducted under both steady state and transient state conditions using the measured hot and cold flange temperature distributions from the fire tests. The elevated temperature reduction factors for mechanical properties were based on the equations proposed by Dolamune Kankanamge and Mahendran (2011). These finite element models were first validated by comparing their results with experimental test results from this study and Kolarkar (2010). The developed finite element models were able to predict the failure times to within 5 minutes. The validated model was then used in a detailed numerical study into the strength of cold-formed thin-walled steel channels used in both the conventional and the new composite panel systems to increase the understanding of their behaviour under non-uniform elevated temperature conditions and to develop fire design rules. The measured time-temperature distributions obtained from the fire tests were used. Since the fire tests showed that the plasterboards provided sufficient lateral restraint until the failure of LSF wall panels, this assumption was also used in the analyses and was further validated by comparison with experimental results. Hence in this study of LSF wall studs, only the flexural buckling about the major axis and local buckling were considered. A new fire design method was proposed using AS/NZS 4600 (SA, 2005), NAS (AISI, 2007) and Eurocode 3 Part 1.3 (ECS, 2006). The importance of considering thermal bowing, magnified thermal bowing and neutral axis shift in the fire design was also investigated.
A spreadsheet-based design tool was developed based on the above design codes to predict the failure load ratio versus time and temperature for varying LSF wall configurations including insulations. Idealised time-temperature profiles were developed based on the measured temperature values of the studs. This was used in a detailed numerical study to fully understand the structural behaviour of LSF wall panels. Appropriate equations were proposed to find the critical temperatures for different composite panels, varying in steel thickness, steel grade and screw spacing, for any load ratio. Hence useful and simple design rules were proposed based on the current cold-formed steel structures and fire design standards, and their accuracy and advantages were discussed. The results were also used to validate the fire design rules developed based on AS/NZS 4600 (SA, 2005) and Eurocode 3 Part 1.3 (ECS, 2006). This demonstrated the significant improvements to the design method when compared to the currently used prescriptive design methods for LSF wall systems under fire conditions. In summary, this research has developed comprehensive experimental and numerical thermal and structural performance data for both the conventional and the proposed new load-bearing LSF wall systems under standard fire conditions. Finite element models were developed to predict the failure times of LSF walls accurately. Idealised hot flange temperature profiles were developed for non-insulated, cavity-insulated and externally insulated load-bearing wall systems. Suitable fire design rules and spreadsheet-based design tools were developed based on the existing standards to predict the ultimate failure load, failure times and failure temperatures of LSF wall studs. Simplified equations were proposed to find the critical temperatures for varying wall panel configurations and load ratios. The results from this research are useful to both structural and fire engineers and researchers. Most importantly, this research has significantly improved the knowledge and understanding of cold-formed LSF load-bearing walls under standard fire conditions.
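The sketch below illustrates how such a spreadsheet-style check might be structured: temperature-dependent reduction factors are interpolated and the critical temperature is taken as the point where the reduced capacity falls to the applied load ratio. The factor values and the simplified capacity model are placeholders for illustration only, not the proposed design rules or the reduction equations of Dolamune Kankanamge and Mahendran (2011).

```python
import numpy as np

# Sketch of a spreadsheet-style critical temperature check for a wall stud.
# The reduction factors and the crude capacity model are placeholder values,
# not the design rules or reduction equations referenced in the abstract.

temps = np.array([20, 100, 200, 300, 400, 500, 600, 700])            # hot flange temp, C
k_fy  = np.array([1.0, 1.0, 0.90, 0.75, 0.60, 0.40, 0.20, 0.07])     # yield strength factor
k_E   = np.array([1.0, 1.0, 0.92, 0.80, 0.65, 0.45, 0.22, 0.09])     # elastic modulus factor

def capacity_ratio(temp):
    """Member capacity at `temp` as a fraction of ambient capacity (toy model)."""
    fy = np.interp(temp, temps, k_fy)
    E = np.interp(temp, temps, k_E)
    return min(fy, np.sqrt(fy * E))   # crude blend of yield and buckling limits

def critical_temperature(load_ratio, grid=np.linspace(20, 700, 1000)):
    """First temperature at which capacity falls below the applied load ratio."""
    for t in grid:
        if capacity_ratio(t) < load_ratio:
            return t
    return None

print(critical_temperature(0.4))   # critical temperature for a 0.4 load ratio
```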
Abstract:
The concept of local accumulation time (LAT) was introduced by Berezhkovskii and coworkers in 2010–2011 to give a finite measure of the time required for the transient solution of a reaction-diffusion equation to approach the steady-state solution (Biophys J. 99, L59 (2010); Phys Rev E. 83, 051906 (2011)). Such a measure is referred to as a critical time. Here, we show that LAT is, in fact, identical to the concept of mean action time (MAT) that was first introduced by McNabb in 1991 (IMA J Appl Math. 47, 193 (1991)). Although McNabb’s initial argument was motivated by considering the mean particle lifetime (MPLT) for a linear death process, he applied the ideas to study diffusion. We extend the work of these authors by deriving expressions for the MAT for a general one-dimensional linear advection-diffusion-reaction problem. Using a combination of continuum and discrete approaches, we show that MAT and MPLT are equivalent for certain uniform-to-uniform transitions; these results provide a practical interpretation for MAT, by directly linking the stochastic microscopic processes to a meaningful macroscopic timescale. We find that for more general transitions, the equivalence between MAT and MPLT does not hold. Unlike other critical time definitions, we show that it is possible to evaluate the MAT without solving the underlying partial differential equation (PDE). This makes MAT a simple and attractive quantity for practical situations. Finally, our work explores the accuracy of certain approximations derived using the MAT, showing that useful approximations for nonlinear kinetic processes can be obtained, again without treating the governing PDE directly.
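For reference, the construction commonly used to define the MAT/LAT can be sketched as follows, in generic notation (not necessarily the symbols used in the cited papers).

```latex
% F(x,t) measures how far the transient c(x,t) has progressed from the
% initial condition c_0(x) towards the steady state c_s(x):
\[
  F(x,t) = 1 - \frac{c(x,t) - c_s(x)}{c_0(x) - c_s(x)},
  \qquad F(x,0) = 0, \qquad \lim_{t\to\infty} F(x,t) = 1 .
\]
% Treating \partial F/\partial t as a density over the transition time gives
% the mean action time; integration by parts yields the second form:
\[
  T(x) = \int_0^\infty t\,\frac{\partial F}{\partial t}\,\mathrm{d}t
       = \int_0^\infty \bigl[\,1 - F(x,t)\,\bigr]\,\mathrm{d}t .
\]
```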
Abstract:
As 2001 was the International Year of the Volunteer, it seemed timely to look at the legal, social and political frameworks which provide for the long-term growth of volunteers. The focus of this research is on the nature and extent of volunteers in the Queensland State Government. The social capital debate (expanded by Robert Putnam in 1995) is about citizens’ participation in extracurricular activities and has been extended to mean a collective intelligence – a capacity as a people to create the society we want. The volunteer phenomenon has been used to indicate social and ethical concern.
Abstract:
It appears that few of the students holding ‘socially idealistic’ goals upon entering law school actually maintain these upon graduation. The critical legal narrative, which explains and seeks to act upon this shift in the graduate’s ‘legal identity’, posits that these ideals are repressed through power relations that create passive receptacles into which professional ideologies can be deposited, in the interests of those advantaged by the social and legal status quo. Using the work of Michel Foucault, this paper unpacks the assumptions underpinning this narrative, particularly its arguments about ideology, power, and the subject. In doing so, it will argue this narrative provides an untenable basis for political action within legal education. By interrogating this narrative, this paper provides a new way of understanding the construction of the legal identity through legal education, and a new basis for political action within law school.
Abstract:
This article explores power within legal education scholarship. It suggests that power relations are not effectively reflected on within this scholarship, and it provokes legal educators to consider power more explicitly and effectively. It then outlines in-depth a conceptual and methodological approach based on Michel Foucault’s concept of ‘governmentality’ to assist in such an analysis. By detailing the conceptual moves required in order to research power in legal education more effectively, this article seeks to stimulate new reflection and thought about the practice and scholarship of legal education, and allow for political interventions to become more ethically sensitive and potentially more effective.
Abstract:
A novel concept for producing high dc voltage for pulsed-power applications is proposed in this paper. The topology consists of an LC resonant circuit supplied through a tuned alternating waveform produced by an inverter. The control scheme is based on detecting variations in the resonant frequency and adjusting the inverter's switching signal patterns to produce a square waveform at exactly the same frequency. The capacitor voltage therefore oscillates divergently with increasing amplitude. A simple one-stage capacitor-diode voltage multiplier (CDVM) connected to the resonant capacitor then rectifies the alternating voltage and gives a dc level equal to twice the input voltage amplitude. The produced high voltage then appears in the form of high-voltage pulses across the load. A basic model is simulated in the Simulink platform of MATLAB and the results are included in the paper.
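A minimal numerical sketch of the resonant charging idea is given below: a series LC circuit driven by a square wave at its resonant frequency builds up capacitor voltage cycle by cycle (unbounded in the lossless ideal), and an ideal one-stage CDVM would deliver roughly twice the resulting ac amplitude as dc. Component values and the drive amplitude are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Series LC circuit driven by a square wave tuned to the resonant frequency.
# Lossless ideal: the capacitor voltage amplitude grows every cycle.
L = 1e-3          # inductance, H (illustrative)
C = 1e-6          # capacitance, F (illustrative)
Vdc = 100.0       # inverter square-wave amplitude, V (illustrative)
f0 = 1.0 / (2 * np.pi * np.sqrt(L * C))   # resonant frequency, Hz

dt = 1.0 / (f0 * 2000)          # time step: 2000 points per resonant period
t = np.arange(0, 20 / f0, dt)   # simulate 20 resonant periods

i = 0.0   # inductor current, A
v = 0.0   # capacitor voltage, V
v_cap = np.empty_like(t)
for k, tk in enumerate(t):
    v_drive = Vdc * np.sign(np.sin(2 * np.pi * f0 * tk))  # tuned square wave
    i += (v_drive - v) / L * dt   # L di/dt = v_drive - v_C (semi-implicit Euler)
    v += i / C * dt               # C dv_C/dt = i
    v_cap[k] = v

peak = np.abs(v_cap).max()
print(f"capacitor peak after 20 cycles: {peak:.0f} V")
print(f"ideal one-stage CDVM output:    {2 * peak:.0f} V")  # ~2x the ac amplitude
```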
Abstract:
'A Simple Plan' is a deceptively complex and multilayered film, combining elements of Celtic mythology with the morality play and the windfall fantasy gone disastrously wrong. Despite its blending of realism and heavy-handed symbolism, and its abundant trans-textual gestures, 'A Simple Plan' is in many ways defiantly not a 90s movie: its leading characters are fashionably flawed, but they are neither sensitive, nor honourable, nor heroic; there are no startling special effects or intricate time-shifts; and it desperately gives the impression of depth, of being emphatically more than mere superficial excess. At a stretch it almost appears to be a throwback to the 1930s Production Code emphasis on the role of cinema in moral instruction; while good hardly triumphs over evil, venality is painfully and emphatically punished. But in other ways it is a quintessential late 90s film: an American/British/Japanese/German/French co-production, 'A Simple Plan' acts most palpably as a commentary on the moral, economic and social condition of the United States at the end of the American century.
Abstract:
Those working in the critical criminology tradition have been centrally concerned with the social construction, variability and contingency of the criminal label. The concern is no less salient to a consideration of critical criminology itself and any history of critical criminology (in Australia or elsewhere) should aim itself to be critical in this sense. The point applies with equal force to both of the terms ‘critical’ and ‘criminology’. The want of a stable theoretical object has meant that criminology itself needs to be seen not as a distinct discipline but as a composite intellectual and governmental hybrid, a field of studies that overlaps and intersects many others (sociology, law, psychology, history, anthropology, social work, media studies and youth studies to name only a few). In consequence, much of the most powerful work on subjects of criminological inquiry is undertaken by scholars who do not necessarily define themselves as criminologists first and foremost, or at all. For reasons that should later become obvious this is even more pronounced in the Australian context. Although we may appear at times to be claiming such work for criminology, our purpose is to recognize its impact on and in critical criminology in Australia.
Abstract:
Australia is currently in the midst of a major resources boom. However the benefits from the boom are unevenly distributed, with state governments collecting billions in royalties, and mining companies billions in profits. The costs are borne mostly at a local level by regional communities on the frontier of the mining boom, surrounded by thousands of men housed in work camps. The escalating reliance on non-resident workers housed in camps carries significant risks for individual workers, host communities and the provision of human services and infrastructure. These include rising rates of fatigue-related death and injuries, rising levels of alcohol-fuelled violence, illegally erected and unregulated work camps, soaring housing costs and other costs of living, and stretched basic infrastructure undermining the sustainability of these towns. But these costs have generally escaped industry, government and academic scrutiny. This chapter directs a critical gaze at the hopelessly compromised industry-funded research vital to legitimating the resource sector’s self-serving knowledge claims that it is committed to social sustainability and corporate responsibility. The chapter divides into two parts. The first argues that post-industrial mining regimes mask and privatise these harms and risks, shifting them on to workers, families and communities. The second part links the privatisation of these risks with the political economy of privatised knowledge embedded in the approvals process for major resource sector projects.
Abstract:
In recent years, a number of phylogenetic methods have been developed for estimating molecular rates and divergence dates under models that relax the molecular clock constraint by allowing rate change throughout the tree. These methods are being used with increasing frequency, but there have been few studies into their accuracy. We tested the accuracy of several relaxed-clock methods (penalized likelihood and Bayesian inference using various models of rate change) using nucleotide sequences simulated on a nine-taxon tree. When the sequences evolved with a constant rate, the methods were able to infer rates accurately, but estimates were more precise when a molecular clock was assumed. When the sequences evolved under a model of autocorrelated rate change, rates were accurately estimated using penalized likelihood and by Bayesian inference using lognormal and exponential models of rate change, while other models did not perform as well. When the sequences evolved under a model of uncorrelated rate change, only Bayesian inference using an exponential rate model performed well. Collectively, the results provide a strong recommendation for using the exponential model of rate change if a conservative approach to divergence time estimation is required. A case study is presented in which we use a simulation-based approach to examine the hypothesis of elevated rates in the Cambrian period, and it is found that these high rate estimates might be an artifact of the rate estimation method. If this bias is present, then the ages of metazoan divergences would be systematically underestimated. The results of this study have implications for studies of molecular rates and divergence dates.
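As a rough illustration of the rate models being compared, the sketch below draws branch rates along a root-to-tip path under a strict clock, an uncorrelated exponential model and an autocorrelated lognormal model. The parameter values and the simplified parent-to-child recursion are assumptions for illustration, not the exact models implemented in the programs compared in the study.

```python
import numpy as np

# Three branch-rate models applied to a simple root-to-tip path of branches.
# Parameter values and the parent-to-child recursion are illustrative only.

rng = np.random.default_rng(1)
n_branches = 8
mean_rate = 1e-3          # substitutions/site/Myr, illustrative

# Strict clock: every branch shares one rate.
strict = np.full(n_branches, mean_rate)

# Uncorrelated exponential: each branch rate drawn independently.
uncorrelated = rng.exponential(mean_rate, n_branches)

# Autocorrelated lognormal: each branch's log-rate is centred on its parent's.
autocorrelated = np.empty(n_branches)
rate = mean_rate
for b in range(n_branches):
    rate = rng.lognormal(mean=np.log(rate) - 0.5 * 0.1**2, sigma=0.1)
    autocorrelated[b] = rate

for name, rates in [("strict", strict),
                    ("uncorrelated exp", uncorrelated),
                    ("autocorrelated LN", autocorrelated)]:
    print(f"{name:18s} mean={rates.mean():.2e} cv={rates.std()/rates.mean():.2f}")
```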