972 results for critical pressure
Abstract:
Seat pressure is known as a major factor in vehicle seat comfort. In passenger vehicles, there is a lack of research on the seat comfort of rear-seat occupants. As accurate seat pressure measurement requires significant effort, simulation of seat pressure is evolving as a preferred method. However, analytic methods are based on complex finite element modeling and are therefore time consuming and involve high investment. Based on accurate anthropometric measurements of 64 male subjects and outboard rear-seat pressure measurements in three different passenger vehicles, this study investigates whether a set of parameters derived from seat pressure mapping are sensitive enough to differentiate between seats, and whether they correlate with anthropometry in linear models. In addition to the pressure map analysis, H-Points were measured with a coordinate measurement system based on palpated body landmarks, and the range of H-Point locations in the three seats is provided. It was found that, for the cushion, cushion contact area and cushion front area/force could be modeled by subject anthropometry, while only seatback contact area could be modeled based on anthropometry for all three vehicles. Major differences were found between the vehicles for other parameters.
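The linear-model analysis described above can be sketched as follows. This is an illustrative example only: the predictor names, units, and data are invented for demonstration and are not the study's measurements.

```python
import numpy as np

# Hypothetical illustration: fit a linear model predicting a seat-pressure
# parameter (e.g. cushion contact area) from anthropometric predictors.
# All predictors and values below are invented, not the study's data.
rng = np.random.default_rng(0)
n = 64  # the study measured 64 male subjects

stature = rng.normal(178, 7, n)      # cm (invented)
body_mass = rng.normal(80, 10, n)    # kg (invented)
hip_breadth = rng.normal(36, 2, n)   # cm (invented)

# Synthetic response with a known linear structure plus noise
contact_area = 200 + 4.0 * body_mass + 2.0 * hip_breadth + rng.normal(0, 15, n)

# Ordinary least squares via the normal-equations solver
X = np.column_stack([np.ones(n), stature, body_mass, hip_breadth])
coef, _, _, _ = np.linalg.lstsq(X, contact_area, rcond=None)

pred = X @ coef
ss_res = np.sum((contact_area - pred) ** 2)
ss_tot = np.sum((contact_area - contact_area.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(f"coefficients: {coef.round(2)}")
print(f"R^2 = {r_squared:.3f}")
```

Whether such a model generalises across seats is exactly the question the study poses: the same predictors may fit one vehicle's cushion well and another's poorly.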
Abstract:
Bedsores (pressure ulcers) are caused by multiple factors which include, but are not limited to: pressure, shear force, friction, temperature, age and medication. Specialised support surfaces, such as specialised mattresses, sheepskin coverings etc., are thought to decrease or relieve pressure, resulting in a lowering of pressure ulcer incidence [3]. The primary aim of this study was to compare the upper/central body pressure distribution between normal lying in a hospital bed and the use of a pressure redistribution belt. The study involved 16 healthy volunteer subjects lying on a hospital bed with and without wearing the belt. Results showed that the use of a pressure redistribution belt reduces pressure peaks and prevents the pressure from increasing over time.
Abstract:
Purpose: To assess the accuracy of intraocular pressure (IOP) measurements using rebound tonometry over disposable hydrogel (etafilcon A) and silicone hydrogel (senofilcon A) contact lenses (CLs) of different powers. Methods: The experimental group comprised 36 subjects (19 male, 17 female). IOP measurements were undertaken on the subjects' right eyes in random order using a rebound tonometer (ICare). The CLs had powers of +2.00D, −2.00D and −6.00D. Six measurements were taken over each contact lens and also before and after the CLs had been worn. Results: A good correlation was found between IOP measurements with and without CLs (all r ≥ 0.80; p < 0.05). Bland-Altman plots did not show any significant trend in the difference in IOP readings with and without CLs as a function of IOP value. A two-way ANOVA revealed a significant effect of material and power (p < 0.01) but no interaction. All the comparisons between the measurements without CLs and with hydrogel CLs were significant (p < 0.01); the comparisons with silicone hydrogel CLs were not. Conclusions: Rebound tonometry can be reliably performed over silicone hydrogel CLs. With hydrogel CLs, the measurements were lower than those without CLs. However, although these differences were statistically significant, their clinical significance was minimal.
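The Bland-Altman analysis mentioned above can be sketched in a few lines. The data here are invented for illustration (a systematic offset of about −1.2 mmHg with lenses, loosely echoing the study's finding of lower readings over hydrogel CLs); this is not the study's dataset or exact procedure.

```python
import numpy as np

# Minimal sketch of a Bland-Altman agreement analysis between two
# measurement conditions. All values below are invented.
rng = np.random.default_rng(1)
iop_no_cl = rng.normal(15, 3, 36)                        # mmHg, without lenses (invented)
iop_with_cl = iop_no_cl - 1.2 + rng.normal(0, 0.8, 36)   # systematically lower (invented)

diff = iop_with_cl - iop_no_cl
mean_pair = (iop_with_cl + iop_no_cl) / 2

bias = diff.mean()                  # mean difference (systematic offset)
loa = 1.96 * diff.std(ddof=1)       # half-width of the 95% limits of agreement
print(f"bias = {bias:.2f} mmHg")
print(f"limits of agreement: [{bias - loa:.2f}, {bias + loa:.2f}] mmHg")

# Correlating diff against mean_pair checks whether the disagreement
# depends on the underlying IOP level (the trend the paper reports absent).
trend = np.corrcoef(mean_pair, diff)[0, 1]
print(f"trend correlation r = {trend:.2f}")
```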
Abstract:
Performance of locomotor pointing tasks (goal-directed locomotion) in sport is typically constrained by dynamic factors, such as positioning of opponents and objects for interception. In the team sport of association football, performers have to coordinate their gait with ball displacement when dribbling and when trying to prevent opponent interception when running to kick a ball. This thesis comprises two studies analysing the movement patterns during locomotor pointing of eight experienced youth football players under static and dynamic constraints by manipulating levels of ball displacement (ball stationary or moving) and defensive pressure (defenders absent, or positioned near or far during performance). ANOVA with repeated measures was used to analyse the effects of these task constraints on gait parameters during the run-up and cross performance sub-phase. Experiment 1 revealed outcomes consistent with previous research on locomotor pointing. When under defensive pressure, participants performed the run-up more quickly, concurrently modifying footfall placements relative to the ball location over trials. In Experiment 2, players coordinated their gait relative to a moving ball significantly differently when under defensive pressure. Despite no specific task instructions being provided beforehand, context-dependent constraints interacted to influence footfall placements over trials and the running velocity of participants in different conditions. Data suggest that coaches need to manipulate task constraints carefully to facilitate emergent movement behaviours during practice in team games like football.
Abstract:
One of the promises of New Labour was that government policy would be grounded in 'evidence based research'. In recent years some academics have come to question whether the government has delivered on this promise. Professors Reece Walters and Tim Hope offer two contributions to this debate, arguing that rather than the 'evidence base', it is political considerations that govern the commissioning, production and dissemination of Home Office research. As the first monograph in our 'Evidence based policy series' Critical thinking about the uses of research carries a thought provoking set of arguments.
Abstract:
This work offers a critical introduction to sociology for New Zealand students. Written in an accessible narrative style, it seeks to challenge and debunk students' assumptions about key elements of their social worlds, encouraging them to develop a "critical imagination" as a tool to identify broader social themes in personal issues.
Abstract:
Radial Hele-Shaw flows are treated analytically using conformal mapping techniques. The geometry of interest has a doubly-connected annular region of viscous fluid surrounding an inviscid bubble that is either expanding or contracting due to a pressure difference caused by injection or suction of the inviscid fluid. The zero-surface-tension problem is ill-posed for both bubble expansion and contraction, as both scenarios involve viscous fluid displacing inviscid fluid. Exact solutions are derived by tracking the location of singularities and critical points in the analytic continuation of the mapping function. We show that by treating the critical points, it is easy to observe finite-time blow-up, and the evolution equations may be written in exact form using complex residues. We present solutions that start with cusps on one interface and end with cusps on the other, as well as solutions that have the bubble contracting to a point. For the latter solutions, the bubble approaches an ellipse in shape at extinction.
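As background for the conformal-mapping approach described above, the classical simply-connected analogue of this problem (not the paper's doubly-connected annular formulation) is governed by the Polubarinova-Galin equation. Writing the moving interface as the image of the unit circle under a time-dependent conformal map \( f(\zeta, t) \), injection of inviscid fluid at rate \( Q \) gives

```latex
% Simply-connected Hele-Shaw analogue; the annular problem of the paper
% generalizes this to a doubly-connected mapping domain.
\operatorname{Re}\!\left[
  \frac{\partial f}{\partial t}\,
  \overline{\zeta \, \frac{\partial f}{\partial \zeta}}
\right] = \frac{Q}{2\pi},
\qquad |\zeta| = 1 .
```

Exact solutions of this type are obtained by tracking singularities and critical points of \( f \) in the analytic continuation beyond the unit disc, which is precisely the mechanism the paper exploits for the annular geometry.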
Abstract:
Existing secure software development principles tend to focus on coding vulnerabilities, such as buffer or integer overflows, that apply to individual program statements, or issues associated with the run-time environment, such as component isolation. Here we instead consider software security from the perspective of potential information flow through a program’s object-oriented module structure. In particular, we define a set of quantifiable "security metrics" which allow programmers to quickly and easily assess the overall security of a given source code program or object-oriented design. Although measuring quality attributes of object-oriented programs for properties such as maintainability and performance has been well-covered in the literature, metrics which measure the quality of information security have received little attention. Moreover, existing security-relevant metrics assess a system either at a very high level, i.e., the whole system, or at a fine level of granularity, i.e., with respect to individual statements. These approaches make it hard and expensive to recognise a secure system from an early stage of development. Instead, our security metrics are based on well-established compositional properties of object-oriented programs (i.e., data encapsulation, cohesion, coupling, composition, extensibility, inheritance and design size), combined with data flow analysis principles that trace potential information flow between high- and low-security system variables. We first define a set of metrics to assess the security quality of a given object-oriented system based on its design artifacts, allowing defects to be detected at an early stage of development. We then extend these metrics to produce a second set applicable to object-oriented program source code.
The resulting metrics make it easy to compare the relative security of functionally equivalent system designs or source code programs so that, for instance, the security of two different revisions of the same system can be compared directly. This capability is further used to study the impact of specific refactoring rules on system security more generally, at both the design and code levels. By measuring the relative security of various programs refactored using different rules, we thus provide guidelines for the safe application of refactoring steps to security-critical programs. Finally, to make it easy and efficient to measure a system design or program’s security, we have also developed a stand-alone software tool which automatically analyses and measures the security of UML designs and Java program code. The tool’s capabilities are demonstrated by applying it to a number of security-critical system designs and Java programs. Notably, the validity of the metrics is demonstrated empirically through measurements that confirm our expectation that program security typically improves as bugs are fixed, but worsens as new functionality is added.
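To make the flavour of such design-level metrics concrete, here is a toy sketch of one encapsulation-style measure: the fraction of security-critical ("classified") attributes of a class that are not private. This is an invented illustration, not the thesis's actual metric definitions, and the class model is deliberately minimal.

```python
from dataclasses import dataclass, field

@dataclass
class ClassDesign:
    """Minimal stand-in for a UML class: attribute name -> (is_private, is_classified)."""
    name: str
    attributes: dict = field(default_factory=dict)

def classified_attribute_exposure(cls: ClassDesign) -> float:
    """Toy metric: fraction of classified attributes that are publicly visible.

    Lower is better -- sensitive data should be encapsulated.
    """
    classified = [priv for priv, is_cls in cls.attributes.values() if is_cls]
    if not classified:
        return 0.0
    exposed = sum(1 for priv in classified if not priv)
    return exposed / len(classified)

account = ClassDesign("Account", {
    "balance":    (True, True),    # private, classified
    "pin":        (False, True),   # public, classified -- a design defect
    "owner_name": (True, False),   # private, not classified
})
print(classified_attribute_exposure(account))  # 0.5: half the sensitive data is exposed
```

Because such a measure is computable from design artifacts alone, two revisions of the same design can be ranked before any code exists, which is the comparison capability described above.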
Abstract:
The concept of local accumulation time (LAT) was introduced by Berezhkovskii and coworkers in 2010–2011 to give a finite measure of the time required for the transient solution of a reaction-diffusion equation to approach the steady-state solution (Biophys J. 99, L59 (2010); Phys Rev E. 83, 051906 (2011)). Such a measure is referred to as a critical time. Here, we show that LAT is, in fact, identical to the concept of mean action time (MAT) that was first introduced by McNabb in 1991 (IMA J Appl Math. 47, 193 (1991)). Although McNabb's initial argument was motivated by considering the mean particle lifetime (MPLT) for a linear death process, he applied the ideas to study diffusion. We extend the work of these authors by deriving expressions for the MAT for a general one-dimensional linear advection-diffusion-reaction problem. Using a combination of continuum and discrete approaches, we show that MAT and MPLT are equivalent for certain uniform-to-uniform transitions; these results provide a practical interpretation for MAT, by directly linking the stochastic microscopic processes to a meaningful macroscopic timescale. We find that for more general transitions, the equivalence between MAT and MPLT does not hold. Unlike other critical time definitions, we show that it is possible to evaluate the MAT without solving the underlying partial differential equation (PDE). This makes MAT a simple and attractive quantity for practical situations. Finally, our work explores the accuracy of certain approximations derived using the MAT, showing that useful approximations for nonlinear kinetic processes can be obtained, again without treating the governing PDE directly.
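For readers unfamiliar with the quantity being equated here, the MAT/LAT is commonly defined as follows in this literature (notation here is generic, not necessarily the paper's). For a solution \( c(x,t) \) relaxing from an initial state \( c(x,0) \) to a steady state \( c_\infty(x) \), one introduces the normalised transition function

```latex
F(x,t) \;=\; 1 \;-\; \frac{c(x,t) - c_\infty(x)}{c(x,0) - c_\infty(x)},
\qquad F(x,0) = 0, \quad F(x,t) \to 1 \text{ as } t \to \infty,
```

and defines the mean action time as the mean of the associated "transition time" distribution, which after integration by parts becomes

```latex
T(x) \;=\; \int_0^\infty t \,\frac{\partial F}{\partial t}\, \mathrm{d}t
\;=\; \int_0^\infty \bigl[\, 1 - F(x,t) \,\bigr] \, \mathrm{d}t .
```

The second form is what makes MAT attractive in practice: the time integral can often be evaluated from the governing equation directly, without solving the PDE for \( c(x,t) \) itself.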
Abstract:
It appears that few of the students holding ‘socially idealistic’ goals upon entering law school actually maintain these upon graduation. The critical legal narrative, which explains and seeks to act upon this shift in the graduate’s ‘legal identity’, posits that these ideals are repressed through power relations that create passive receptacles into which professional ideologies can be deposited, in the interests of those advantaged by the social and legal status quo. Using the work of Michel Foucault, this paper unpacks the assumptions underpinning this narrative, particularly its arguments about ideology, power, and the subject. In doing so, it will argue this narrative provides an untenable basis for political action within legal education. By interrogating this narrative, this paper provides a new way of understanding the construction of the legal identity through legal education, and a new basis for political action within law school.
Abstract:
This article explores power within legal education scholarship. It suggests that power relations are not effectively reflected on within this scholarship, and it provokes legal educators to consider power more explicitly and effectively. It then outlines in-depth a conceptual and methodological approach based on Michel Foucault’s concept of ‘governmentality’ to assist in such an analysis. By detailing the conceptual moves required in order to research power in legal education more effectively, this article seeks to stimulate new reflection and thought about the practice and scholarship of legal education, and allow for political interventions to become more ethically sensitive and potentially more effective.
Abstract:
Those working in the critical criminology tradition have been centrally concerned with the social construction, variability and contingency of the criminal label. The concern is no less salient to a consideration of critical criminology itself and any history of critical criminology (in Australia or elsewhere) should aim itself to be critical in this sense. The point applies with equal force to both of the terms ‘critical’ and ‘criminology’. The want of a stable theoretical object has meant that criminology itself needs to be seen not as a distinct discipline but as a composite intellectual and governmental hybrid, a field of studies that overlaps and intersects many others (sociology, law, psychology, history, anthropology, social work, media studies and youth studies to name only a few). In consequence, much of the most powerful work on subjects of criminological inquiry is undertaken by scholars who do not necessarily define themselves as criminologists first and foremost, or at all. For reasons that should later become obvious this is even more pronounced in the Australian context. Although we may appear at times to be claiming such work for criminology, our purpose is to recognize its impact on and in critical criminology in Australia.
Abstract:
In recent years, a number of phylogenetic methods have been developed for estimating molecular rates and divergence dates under models that relax the molecular clock constraint by allowing rate change throughout the tree. These methods are being used with increasing frequency, but there have been few studies into their accuracy. We tested the accuracy of several relaxed-clock methods (penalized likelihood and Bayesian inference using various models of rate change) using nucleotide sequences simulated on a nine-taxon tree. When the sequences evolved with a constant rate, the methods were able to infer rates accurately, but estimates were more precise when a molecular clock was assumed. When the sequences evolved under a model of autocorrelated rate change, rates were accurately estimated using penalized likelihood and by Bayesian inference using lognormal and exponential models of rate change, while other models did not perform as well. When the sequences evolved under a model of uncorrelated rate change, only Bayesian inference using an exponential rate model performed well. Collectively, the results provide a strong recommendation for using the exponential model of rate change if a conservative approach to divergence time estimation is required. A case study is presented in which we use a simulation-based approach to examine the hypothesis of elevated rates in the Cambrian period, and it is found that these high rate estimates might be an artifact of the rate estimation method. If this bias is present, then the ages of metazoan divergences would be systematically underestimated. The results of this study have implications for studies of molecular rates and divergence dates.
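The three classes of rate model compared above can be sketched generatively. This is an invented illustration of how branch rates are drawn under each model for a simple chain of ancestor-to-descendant branches; it is not the paper's simulation code, and the base rate and variance parameters are arbitrary.

```python
import numpy as np

# Illustrative draws of per-branch substitution rates under three clock
# models (all parameter values are invented).
rng = np.random.default_rng(2)
n_branches = 8
base_rate = 0.01  # substitutions/site/Myr (invented)

# Strict molecular clock: one rate on every branch.
strict = np.full(n_branches, base_rate)

# Autocorrelated lognormal model: each branch rate is drawn lognormally
# around its parent's rate, so rates drift gradually along the tree.
sigma = 0.3
auto = np.empty(n_branches)
rate = base_rate
for i in range(n_branches):
    rate = np.exp(rng.normal(np.log(rate), sigma))
    auto[i] = rate

# Uncorrelated exponential model: each branch rate drawn independently
# from an exponential distribution with mean equal to the base rate.
uncorr = rng.exponential(base_rate, n_branches)

print("strict:        ", strict.round(4))
print("autocorrelated:", auto.round(4))
print("uncorrelated:  ", uncorr.round(4))
```

Sequences simulated under each regime, then analysed with the various relaxed-clock estimators, is the basic design of the accuracy test the abstract describes.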
Abstract:
This paper presents an experimental investigation into the detection of excessive Diesel knock using acoustic emission signals. Three different dual-fuel Diesel engine operating regimes were induced into a compression ignition (Diesel) engine operating on both straight Diesel fuel and two different mixtures of fumigated ethanol and Diesel. The experimentally induced engine operating regimes were: normal, or Diesel-only, operation; acceptable dual-fuel operation; and dual-fuel operation with excessive Diesel knock. During the excessive Diesel knock operating regime, high rates of ethanol substitution induced potentially damaging levels of Diesel knock. Acoustic emission data was captured along with cylinder pressure, crank-angle encoder, and top-dead centre signals for the different engine operating regimes. Using these signals, it was found that acoustic emission signals clearly distinguished between the two acceptable operating regimes and the operating regime experiencing excessive Diesel knock. It was also found that acoustic emission sensor position is critical. The acoustic emission sensor positioned on the block of the engine clearly related information concerning the level of Diesel knock occurring in the engine whilst the sensor positioned on the head of the engine gave no indication concerning Diesel knock severity levels.
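One common way to quantify knock-like bursts in an acoustic emission trace is short-window RMS energy relative to the background level. The sketch below illustrates that idea on a synthetic signal; it is not necessarily the signal-processing chain used in the paper, and the sample rate, burst shape, and threshold are invented.

```python
import numpy as np

# Hypothetical knock-intensity indicator from an AE trace (all values invented).
rng = np.random.default_rng(3)
fs = 100_000                       # sample rate, Hz (invented)
t = np.arange(0, 0.02, 1 / fs)

# Synthetic AE trace: background noise plus a short decaying burst
# standing in for a knock event at t = 10 ms.
burst = np.exp(-(t - 0.01) * 2000) * np.sin(2 * np.pi * 20_000 * t)
burst[t < 0.01] = 0.0
signal = 0.05 * rng.standard_normal(t.size) + burst

def windowed_rms(x, win):
    """RMS energy over consecutive non-overlapping windows of length `win`."""
    n = x.size // win
    return np.sqrt((x[: n * win].reshape(n, win) ** 2).mean(axis=1))

rms = windowed_rms(signal, win=200)
knock_index = rms.max() / np.median(rms)   # burst-to-background energy ratio
print(f"knock index = {knock_index:.1f}")
```

A large burst-to-background ratio on a block-mounted sensor, and its absence on a head-mounted one, would correspond to the sensor-placement finding reported above.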
Abstract:
Purpose – The rapidly changing role of capital city airports has placed demands on surrounding infrastructure. The need for infrastructure management and coordination is increasing as airports and cities grow and share common infrastructure frameworks. The purpose of this paper is to document the changing context in Australia, where the privatisation of airports has stimulated considerable land development with resulting pressures on surrounding infrastructure provision. It aims to describe a tool that is being developed to support decision-making between various stakeholders in the airport region. The use of planning support systems improves both communication and data transfer between stakeholders and provides a foundation for complex decisions on infrastructure. Design/methodology/approach – The research uses a case study approach and focuses on Brisbane International Airport and Brisbane City Council. The research is primarily descriptive and provides an empirical assessment of the challenges of developing and implementing planning support systems as a tool for governance and decision-making. Findings – The research assesses the challenges in implementing a common data platform for stakeholders. Agency data platforms and models, traditional roles in infrastructure planning, and integrating similar data platforms all provide barriers to sharing a common language. The use of a decision support system has to be shared by all stakeholders with a common platform that can be versatile enough to support scenarios and changing conditions. The use of iPads for scenario modelling provides stakeholders with the opportunity to interact, compare scenarios and views, and react with the modellers to explore other options. Originality/value – The research confirms that planning support systems have to be accessible and interactive for their users.
The Airport City concept is a new and evolving focus for airport development and will place continuing pressure on infrastructure servicing. A coordinated and efficient approach to infrastructure decision-making is critical, and an interactive planning support system that can model infrastructure scenarios provides a sound tool for governance.