959 results for Classical Peronism
Abstract:
The desire to solve problems caused by socket prostheses in transfemoral amputees (TFA) and the success of osseointegration in dental applications led to the introduction of osseointegration in orthopedic surgery. Since its first introduction in 1990 in Gothenburg, Sweden, osseointegrated (OI) orthopedic fixation has demonstrated several benefits [1]. The surgery consists of two surgical procedures followed by a lengthy rehabilitation program, which includes a specific training period with a short training prosthesis. Since mechanical loading is considered one of the key factors influencing bone mass and the osseointegration of bone-anchored implants, the rehabilitation program also needs to include some form of load bearing exercises (LBE). To date, there are two frequently used commercially available human implants, and the literature shows that load bearing exercises are performed by patients with both types of OI implant. We refer to two articles: the first, by Aschoff et al., was published in 2010 in the Journal of Bone and Joint Surgery [2]; the second, by Hagberg et al. (2009), gives a very thorough description of the rehabilitation program for TFA fitted with an OPRA implant. The progression of the load, however, is determined individually according to the quality of the residual skeleton, the pain level and the body weight of the participant [1]. Patients use a classical bathroom weighing scale to control the load on the implant during the course of their rehabilitation. The bathroom scale is an affordable and easy-to-use device, but it has some important shortcomings: it provides instantaneous feedback to the patient only on the magnitude of the vertical component of the applied force, while the forces and moments applied along and around the three axes of the implant remain unknown. Although there are other ways to assess the load on the implant, for instance through inverse dynamics in a motion analysis laboratory [3-6], such assessment is challenging. A recent proof-of-concept study by Frossard et al. (2009) showed that the shortcomings of the weighing scale can be overcome by a portable kinetic system based on a commercial transducer [7].
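As a simple illustration of why a vertical-only reading understates the load on the implant, the sketch below compares the vertical force component (what a bathroom scale would report) with the resultant of all three force components, for invented transducer samples; the axis names and numbers are illustrative only, not values from the cited studies.

```python
import numpy as np

# Hypothetical force samples (N) along the implant's medio-lateral (x),
# antero-posterior (y) and long/vertical (z) axes, e.g. from a six-axis transducer.
forces = np.array([
    [120.0, -60.0, 610.0],
    [150.0, -45.0, 645.0],
    [135.0, -80.0, 590.0],
])

vertical = forces[:, 2]                      # what a vertical-only scale would report
resultant = np.linalg.norm(forces, axis=1)   # magnitude of the full 3-D force vector

for fz, fr in zip(vertical, resultant):
    print(f"vertical-only: {fz:6.1f} N   resultant: {fr:6.1f} N   "
          f"underestimate: {100 * (fr - fz) / fr:4.1f} %")
```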
Abstract:
In classical fear conditioning, a neutral conditioned stimulus (CS) is paired with an aversive unconditioned stimulus (US). The CS thereby acquires the capacity to elicit a fear response. This type of associative learning is thought to require co-activation of principal neurons in the lateral nucleus of the amygdala (LA) by two sets of synaptic inputs...
Abstract:
In classical fear conditioning, a neutral conditioned stimulus (CS), such as a tone, is paired with an aversive unconditioned stimulus (US), such as a shock. The CS thereby acquires the capacity to elicit a fear response. This type of associative learning is thought to require co-activation of principal neurons in the lateral nucleus of the amygdala (LA) by two sets of synaptic inputs, a weak CS and a strong US...
Abstract:
Programming is a subject that many beginning students find difficult. The PHP Intelligent Tutoring System (PHP ITS) has been designed with the aim of making it easier for novices to learn the PHP language in order to develop dynamic web pages. Programming requires practice, which makes it necessary to include practical exercises in any ITS that supports students learning to program. The PHP ITS works by providing exercises for students to solve and then giving feedback based on their solutions. The major challenge is being able to identify the many semantically equivalent solutions to a single exercise. The PHP ITS achieves this by using theories of Artificial Intelligence (AI), including first-order predicate logic and classical and hierarchical planning, to model the subject matter taught by the system. This paper highlights the approach taken by the PHP ITS to analyse students' programs involving a number of program constructs used by beginners of web development. The PHP ITS was built using this model and evaluated in a unit at the Queensland University of Technology. The results showed that it was capable of correctly analysing over 96% of the solutions to exercises supplied by students.
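As a loose sketch of why a planning-style model can accept many semantically equivalent solutions, the example below (written in Python rather than PHP, with invented construct names and goal predicates, and omitting action preconditions) accepts any sequence of recognised constructs whose accumulated effects cover the goal predicates; it is not the actual PHP ITS model.

```python
# Each recognised construct is treated as an action that adds facts (predicates)
# to the state; a solution is accepted when the goal predicates are all reached,
# regardless of which constructs were used or in what order.
ACTIONS = {
    "echo_header":    {"header_printed"},
    "read_post_var":  {"input_read"},
    "print_post_var": {"input_read", "input_echoed"},
    "concat_echo":    {"input_read", "input_echoed"},
}

GOAL = {"header_printed", "input_echoed"}

def achieves_goal(solution):
    state = set()
    for construct in solution:
        state |= ACTIONS[construct]   # accumulate the effects of each construct
    return GOAL <= state

# Two semantically equivalent student solutions using different constructs/order.
print(achieves_goal(["echo_header", "print_post_var"]))   # True
print(achieves_goal(["concat_echo", "echo_header"]))       # True
print(achieves_goal(["echo_header", "read_post_var"]))     # False: input never echoed
```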
Abstract:
The generation of solar thermal power depends on the amount of sunlight exposure, as influenced by the day-night cycle and seasonal variations. In this paper, robust optimisation is applied to the design of a power block and turbine generating 30 MWe from a concentrated solar resource at 560 °C. The robust approach is important for attaining a high average performance (minimum efficiency change) over the expected operating ranges of temperature, speed and mass flow. The final objective function combines the turbine performance and efficiency weighted by the off-design performance. The robust optimisation methodology presented in the paper provides further information that greatly aids the design of non-classical power blocks by considering off-design conditions and the resulting performance.
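As a minimal sketch of a robustness-weighted objective (a toy stand-in, not the power-block and turbine model used in the paper), the code below combines a nominal design-point efficiency with the average efficiency over a set of assumed off-design conditions and maximises the weighted sum; the efficiency model, weight and operating points are all illustrative.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Toy efficiency model: one design variable (e.g. a normalised velocity ratio)
# with a quadratic fall-off away from the condition it is matched to.
def efficiency(design, condition):
    return 0.90 - 2.0 * (design - condition) ** 2

# Assumed off-design operating points (normalised speed / mass-flow conditions).
off_design = np.array([0.80, 0.90, 1.10, 1.30])
weight = 0.5  # relative weight given to off-design performance

def robust_objective(design):
    nominal = efficiency(design, 1.0)                     # design-point efficiency
    average_off = efficiency(design, off_design).mean()   # mean off-design efficiency
    return -((1 - weight) * nominal + weight * average_off)  # negate to maximise

result = minimize_scalar(robust_objective, bounds=(0.5, 1.5), method="bounded")
print(f"robust design variable: {result.x:.3f}")
print(f"weighted efficiency at optimum: {-result.fun:.3f}")
```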
Abstract:
The increasing linguistic and cultural diversity of our contemporary world points to the salience of maintaining and developing the Heritage Languages of ethnic minority groups. The mutually constitutive effect between Heritage Language learning and ethnic identity construction has been well documented in the literature. Classical social psychological work often quantitatively structures this phenomenon in a predictable linear relationship. In contrast, poststructural scholarship draws on qualitative approaches to claim the malleable and multiple dynamics behind the phenomenon. The two schools oppose but complement each other. Nevertheless, both struggle to capture the detailed and nuanced construction of ethnic identity through Heritage Language learning. Departing from the extant research, we attempt to ethnomethodologically unearth the nuisances and predicaments embedded in the reflexive, subtle, and multi-layered identity constructions through nuanced, inter-nested language practices. Drawing on data from the qualitative phase of a large project, we highlight some small but powerful moments abstracted from the interview accounts of five Chinese Australian young people. Firstly, we zoom in on the life politics behind the 'seen but unnoticed' stereotype that looking Chinese means being able to speak Chinese. Secondly, we speculate on the power relations between the speaker and the listener through the momentary and inadvertent breaches of the taken-for-granted stereotype. Next, we unveil how learning Chinese has become an accountably rational priority for these young Chinese Australians. Finally, we argue that the normalised stereotype becomes visible and hence stable when it is breached, a practical accomplishment that we term 'habitus realisation'.
Abstract:
What type of probability theory best describes the way humans make judgments under uncertainty and decisions under conflict? Although rational models of cognition have become prominent and have achieved much success, they adhere to the laws of classical probability theory despite the fact that human reasoning does not always conform to these laws. For this reason we have seen the recent emergence of models based on an alternative probabilistic framework drawn from quantum theory. These quantum models show promise in addressing cognitive phenomena that have proven recalcitrant to modeling by means of classical probability theory. This review compares and contrasts probabilistic models based on Bayesian or classical versus quantum principles, and highlights the advantages and disadvantages of each approach.
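As a toy illustration of the kind of phenomenon that motivates quantum models, the sketch below (with invented angles and an invented initial state) shows that sequential projections in a two-dimensional "belief" space are order-dependent, whereas a classical joint probability P(A and B) would not depend on the order in which two questions are asked.

```python
import numpy as np

# Projector onto a unit vector at a given angle in a 2-D real "belief" space.
def projector(angle):
    v = np.array([np.cos(angle), np.sin(angle)])
    return np.outer(v, v)

psi = np.array([1.0, 0.0])          # initial belief state (illustrative)
A = projector(np.radians(25.0))     # answering "yes" to question A
B = projector(np.radians(70.0))     # answering "yes" to question B

def prob_yes_then_yes(first, second, state):
    after_first = first @ state
    p_first = after_first @ after_first          # probability of the first "yes"
    if p_first == 0:
        return 0.0
    collapsed = after_first / np.sqrt(p_first)   # state update after the first answer
    after_second = second @ collapsed
    return p_first * (after_second @ after_second)

print("P(A yes, then B yes):", prob_yes_then_yes(A, B, psi))
print("P(B yes, then A yes):", prob_yes_then_yes(B, A, psi))  # differs: an order effect
```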
Abstract:
In The Climate Change Review, Ross Garnaut emphasised that ‘Climate change and climate change mitigation will bring about major structural change in the agriculture, forestry and other land use sectors’. He provides this overview of the effects of climate change on food demand and supply: ‘Domestic food production in many developing countries will be at immediate risk of reductions in agricultural productivity due to crop failure, livestock loss, severe weather events and new patterns of pests and diseases.’ He observes that ‘Changes to local climate and water availability will be key determinants of where agricultural production occurs and what is produced.’ Gert Würtenberger has commented that modern plant breeding is particularly concerned with addressing larger issues about nutrition, food security and climate change: ‘Modern plant breeding has an increasing importance with regard to the continuously growing demand for plants for nutritional and feeding purposes as well as with regard to renewal energy sources and the challenges caused by climate changes.’ Moreover, he notes that there is a wide array of scientific and technological means of breeding new plant varieties: ‘Apart from classical breeding, technologies have an important role in the development of plants that satisfy the various requirements that industrial and agricultural challenges expect to be fulfilled.’ He comments: ‘Plant variety rights, as well as patents which protect such results, are of increasingly high importance to the breeders and enterprises involved in plant development programmes.’ There has been growing interest in the intersections between sustainable agriculture, environmental protection and food security. The debate over agricultural intellectual property is a polarised one, particularly between plant breeders, agricultural biotechnology companies and a range of environmentalist groups. Susan Sell comments that there are complex intellectual property battles surrounding agriculture: 'Seeds are at the centre of a complex political dynamic between stakeholders. Access to seeds concerns the balance between private rights and public obligations, private ownership and the public domain, and commercial versus humanitarian objectives.' Part I of this chapter considers debates in respect of plant breeders’ rights, food security and climate change in relation to the UPOV Convention 1991. Part II explores efforts by agricultural biotechnology companies to patent climate-ready crops. Part III considers the report of the Special Rapporteur for Food, Olivier De Schutter. It looks at a variety of options to encourage access to plant varieties with climate adaptive or mitigating properties.
Abstract:
Background The diagnosis of frailty is based on physical impairments, and clinicians have indicated that early detection is one of the most effective methods for reducing the severity of physical frailty. An alternative to the classical diagnosis could be the instrumentation of classical functional tests, such as the Romberg test or the Timed Get Up and Go test. The aim of this study was (I) to measure and describe the magnitude of accelerometry values in the Romberg test in two groups of frail and non-frail elderly people through instrumentation with the iPhone 4®, (II) to analyse the performances and differences between the study groups, and (III) to analyse the performances and differences within study groups to characterise accelerometer responses to increasingly difficult challenges to balance. Methods This is a cross-sectional study of 18 subjects over 70 years old: 9 frail subjects and 9 non-frail subjects. The non-parametric Mann–Whitney U test was used for between-group comparisons of mean values derived from the different tasks. The Wilcoxon signed-rank test was used to analyse differences between variants of the test within each study group. Results The largest between-group differences were found in the accelerometry values with eyes closed and feet parallel: maximum peak acceleration in the lateral axis (p < 0.01), minimum peak acceleration in the lateral axis (p < 0.01) and minimum peak acceleration of the resultant vector (p < 0.01). With eyes open and feet parallel, the greatest differences between the groups were in the maximum peak acceleration in the lateral axis (p < 0.01), minimum peak acceleration in the lateral axis (p < 0.01) and minimum peak acceleration of the resultant vector (p < 0.001). With eyes closed and feet in tandem, the greatest difference between the groups was in the minimum peak acceleration in the lateral axis (p < 0.01). Conclusions The accelerometer fitted in the iPhone 4® is able to study and analyse the kinematics of the Romberg test in frail and non-frail elderly people. In addition, the accelerometry values differed significantly between the frail and non-frail groups, and the accelerometer values increased as the test was made more difficult.
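The peak-acceleration metrics reported above can be computed directly from tri-axial accelerometer samples; the sketch below shows one hedged way to do so for hypothetical data (the sample values and axis assignment are invented).

```python
import numpy as np

# Hypothetical tri-axial accelerometer samples (m/s^2) from a body-mounted phone:
# columns are lateral (x), antero-posterior (y) and vertical (z) accelerations.
samples = np.array([
    [ 0.12, -0.05, 9.78],
    [-0.31,  0.08, 9.91],
    [ 0.44, -0.12, 9.63],
    [-0.20,  0.02, 9.85],
])

lateral = samples[:, 0]
resultant = np.linalg.norm(samples, axis=1)   # magnitude of the resultant vector

metrics = {
    "max peak acceleration, lateral axis": lateral.max(),
    "min peak acceleration, lateral axis": lateral.min(),
    "min peak acceleration, resultant vector": resultant.min(),
}
for name, value in metrics.items():
    print(f"{name}: {value:.3f} m/s^2")
```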
Abstract:
Large multi-site image-analysis studies have successfully discovered genetic variants that affect brain structure in tens of thousands of subjects scanned worldwide. Candidate genes have also been associated with brain integrity, measured using fractional anisotropy (FA) in diffusion tensor images (DTI). To evaluate the heritability and robustness of DTI measures as a target for genetic analysis, we compared 417 twins and siblings scanned on the same day on the same high-field (4-Tesla) scanner with two protocols: (1) 94 directions, 2 mm slice thickness; (2) 27 directions, 5 mm slice thickness. Using mean FA in white-matter ROIs and FA skeletons derived using FSL, we (1) examined differences in voxelwise means, variances, and correlations among the measures; and (2) assessed heritability with structural equation models, using the classical twin design. FA measures from the genu of the corpus callosum were highly heritable, regardless of protocol. Genome-wide analysis of the genu mean FA revealed differences across protocols in the top associations.
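As a rough, self-contained illustration of the classical twin design mentioned above, the sketch below estimates heritability from simulated monozygotic and dizygotic pair correlations using Falconer's approximation; the study itself used structural equation models (ACE), and all numbers here are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical mean-FA values for monozygotic and dizygotic twin pairs
# (columns: twin 1, twin 2); MZ pairs are simulated as more strongly correlated.
mz_pairs = rng.multivariate_normal([0.55, 0.55], [[0.002, 0.0016], [0.0016, 0.002]], 60)
dz_pairs = rng.multivariate_normal([0.55, 0.55], [[0.002, 0.0008], [0.0008, 0.002]], 60)

r_mz = np.corrcoef(mz_pairs[:, 0], mz_pairs[:, 1])[0, 1]
r_dz = np.corrcoef(dz_pairs[:, 0], dz_pairs[:, 1])[0, 1]

h2 = 2 * (r_mz - r_dz)   # Falconer's quick approximation of heritability
print(f"r_MZ = {r_mz:.2f}, r_DZ = {r_dz:.2f}, h^2 approx. {h2:.2f}")
```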
Abstract:
Origin-Destination matrix (ODM) estimation can benefit from the availability of sample trajectories, which can now be measured thanks to recent technologies. This paper focuses on the case of transport networks where traffic counts are measured by magnetic loops and sample trajectories are available. An example of such a network is the city of Brisbane, where Bluetooth detectors are now operating. This additional data source is used to extend classical ODM estimation to a link-specific ODM (LODM), using a convex optimisation formulation that also incorporates network constraints. The proposed algorithm is assessed on a simulated network.
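As a toy illustration of count-plus-trajectory OD estimation (not the paper's actual LODM formulation), the sketch below solves a small non-negative least-squares problem in which loop-detector link counts are matched while a prior derived from sampled trajectories regularises the estimate; the network, counts, prior and weight are all invented.

```python
import numpy as np
from scipy.optimize import lsq_linear

# Toy network: 3 OD pairs routed over 4 links. A[i, j] = 1 if OD pair j uses link i.
A = np.array([
    [1, 0, 1],
    [1, 1, 0],
    [0, 1, 1],
    [0, 0, 1],
], dtype=float)

link_counts = np.array([900.0, 700.0, 800.0, 400.0])   # loop-detector counts
prior = np.array([450.0, 250.0, 420.0])                # OD prior from sampled trajectories
lam = 0.1                                               # weight of the trajectory prior

# Stack the count-matching term and the prior term into one non-negative
# least-squares problem: min ||A d - counts||^2 + lam ||d - prior||^2, d >= 0.
A_aug = np.vstack([A, np.sqrt(lam) * np.eye(3)])
b_aug = np.concatenate([link_counts, np.sqrt(lam) * prior])

demand = lsq_linear(A_aug, b_aug, bounds=(0, np.inf)).x
print("estimated OD demand:", np.round(demand, 1))
```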
Abstract:
A central dimension of the State’s responsibility in a liberal democracy and any just society is the protection of individuals’ central rights and freedoms, and the creation of the minimum conditions under which each individual has an opportunity to lead a life of sufficient equality, dignity and value. A special subset of this responsibility is to protect those who are unable to protect themselves from genuine harm. Substantial numbers of children suffer serious physical, emotional and sexual abuse, and neglect at the hands of their parents and caregivers or by other known parties. Child abuse and neglect occurs in a situation of extreme power asymmetry. The physical, social, behavioural and economic costs to the individual, and the social and economic costs to communities, are vast. Children are not generally able to protect themselves from serious abuse and neglect. This enlivens both the State’s responsibility to protect the child, and the debate about how that responsibility can and should be discharged. A core question arises for all societies, given that most serious child maltreatment occurs in the family sphere, is unlikely to be disclosed, causes substantial harm to both individual and community, and infringes fundamental individual rights and freedoms. The question is: how can society identify these situations so that the maltreatment can be interrupted, the child’s needs for security and safety, and health and other rehabilitation can be met, and the family’s needs can be addressed to reduce the likelihood of recurrence? This chapter proposes a theoretical framework applicable for any society that is considering justifiable and effective policy approaches to identify and respond to cases of serious child abuse and neglect. The core of the theoretical framework is based on major principles from both classical liberal political philosophy (Locke and Mill), and leading political philosophers from the twentieth century and the first part of the new millennium (Rawls, Rorty, Okin, Nussbaum), and is further situated within fundamental frameworks of civil and criminal law, and health and economics.
Abstract:
Deterrence-based initiatives form a cornerstone of many road safety countermeasures. This approach is informed by Classical Deterrence Theory, which proposes that individuals will be deterred from committing offences if they fear the perceived consequences of the act, especially the perceived certainty, severity and swiftness of sanctions. While deterrence-based countermeasures have proven effective in reducing a range of illegal driving behaviours known to cause crashes, such as speeding and drink driving, the exact level of exposure, and how the process works, remain unknown. As a result, the current study involved a systematic review of the literature to identify theoretical advancements within deterrence theory that have informed evidence-based practice. Studies that reported on perceptual deterrence between 1950 and June 2015 were searched in electronic databases, including PsycINFO and ScienceDirect, within both road safety and non-road-safety fields. This review indicated that scientific efforts to understand deterrence processes in road safety were most intense during the 1970s and 1980s. This era produced competing theories postulating that both legal and non-legal factors can influence offending behaviour. Since this time, little theoretical progression has been made in the road safety arena, apart from Stafford and Warr's (1993) reconceptualisation of deterrence, which illuminated the important issue of punishment avoidance. In contrast, the broader field of criminology has continued to advance theoretical knowledge by investigating a range of individual-difference factors proposed to influence deterrent processes, including moral inhibition, social bonding, self-control and tendencies to discount the future. However, this scientific knowledge has not been directed towards identifying how best to utilise deterrence mechanisms to improve road safety. This paper will highlight the implications of this lack of progression and provide direction for future research.
Abstract:
Driving on the approach to a signalized intersection while distracted is relatively risky, as potential vehicular conflicts and the resulting angle collisions tend to be more severe than at other locations. Given the prevalence and importance of this scenario, the objective of this study was to examine the decisions and actions of distracted drivers at the onset of yellow lights. Driving simulator data were obtained from a sample of 69 drivers under baseline and handheld cell phone conditions at the National Advanced Driving Simulator at the University of Iowa. Explanatory variables included age, gender, cell phone use, distance to the stop-line, and speed. Although there is extensive research on drivers' responses to yellow traffic signals, these examinations have been conducted with traditional regression-based approaches, which do not necessarily reveal the underlying relations and patterns in the sampled data. In this paper, we exploit the benefits of both classical statistical inference and data mining techniques to identify relationships among main effects, non-linearities, and interaction effects. Results suggest that the probability of yellow light running increases with driving speed at the onset of yellow. Both young (18–25 years) and middle-aged (30–45 years) drivers show a reduced propensity for yellow light running while distracted across the entire speed range, exhibiting possible risk compensation during this critical driving situation. The propensity for yellow light running of both distracted male and female older (50–60 years) drivers is significantly higher. Driver experience, captured by age, interacts with distraction, so that the combination of slower physiological responses and distraction is particularly risky.
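To illustrate how classical inference and data mining can be combined on this kind of problem, the sketch below fits a logistic regression and a shallow decision tree to simulated yellow-onset data; the variables mirror those named in the abstract, but the data-generating rule and all numbers are invented and are not the study's results.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(1)
n = 400

# Simulated approach data: speed (km/h), distance to stop-line (m), phone use (0/1).
speed = rng.uniform(40, 80, n)
distance = rng.uniform(10, 80, n)
phone = rng.integers(0, 2, n)

# Hypothetical generating rule: running is more likely at higher speed and shorter
# distance; phone use slightly reduces it (a risk-compensation-style effect).
logit = 0.08 * speed - 0.05 * distance - 0.4 * phone - 2.0
ran_yellow = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([speed, distance, phone])

# Classical statistical view: logistic regression coefficients (direction of effects).
glm = LogisticRegression(max_iter=1000).fit(X, ran_yellow)
print("logistic coefficients [speed, distance, phone]:", np.round(glm.coef_[0], 3))

# Data-mining view: a shallow tree exposes non-linearities and interactions.
tree = DecisionTreeClassifier(max_depth=3).fit(X, ran_yellow)
print("tree training accuracy:", round(tree.score(X, ran_yellow), 2))
```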
Abstract:
Speculative property developers, criticised for building dog boxes and the slums of tomorrow, are generally hated by urban planners and the public alike. But the doors of state governments are seemingly always open to developers and their lobbyists. Politicians find it hard to say no to the demands of the development industry for concessions because of the contribution housing construction makes to the economic bottom line and because there is a need for well-located housing. New supply is also seen as a solution to declining housing affordability. Classical economic theory, however, is too simplistic for housing supply. Instead, an offshoot of Game Theory, Market Design, not only offers greater insight into apartment supply but can also simultaneously address price, design and quality issues. New research reveals that the most significant risk in residential development is settlement risk: buyers failing to proceed with their purchase despite there being a pre-sale contract. At the point of settlement, the developer has expended all the project funds only to see forecast revenue evaporate. While new buyers may be found, this process is likely to strip the profitability out of the project. As the global financial crisis exposed, buyers are inclined to walk if property values slide. This settlement problem reflects a poor legal mechanism (the pre-sale contract) and a lack of incentive for truthfulness. A second problem is the search cost of finding buyers. At around 10% of project costs, pre-sales are more expensive to developers than finance. This is where Market Design comes in.