290 results for Classical Peronism


Relevance:

10.00%

Publisher:

Abstract:

Background The diagnosis of frailty is based on physical impairments, and clinicians have indicated that early detection is one of the most effective methods for reducing the severity of physical frailty. An alternative to the classical diagnosis could be the instrumentation of classical functional tests, such as the Romberg test or the Timed Get Up and Go test. The aim of this study was (I) to measure and describe the magnitude of accelerometry values in the Romberg test in two groups of frail and non-frail elderly people through instrumentation with the iPhone 4®, (II) to analyse the performances and differences between the study groups, and (III) to analyse the performances and differences within the study groups to characterise accelerometer responses to increasingly difficult challenges to balance. Methods This is a cross-sectional study of 18 subjects over 70 years old: 9 frail and 9 non-frail. The non-parametric Mann–Whitney U test was used for between-group comparisons of mean values derived from the different tasks. The Wilcoxon signed-rank test was used to analyse differences between variants of the test within each study group. Results The largest between-group differences were found in the accelerometer values with eyes closed and feet parallel: maximum peak acceleration in the lateral axis (p < 0.01), minimum peak acceleration in the lateral axis (p < 0.01) and minimum peak acceleration from the resultant vector (p < 0.01). With eyes open and feet parallel, the greatest differences between the groups were in the maximum peak acceleration in the lateral axis (p < 0.01), minimum peak acceleration in the lateral axis (p < 0.01) and minimum peak acceleration from the resultant vector (p < 0.001). With eyes closed and feet in tandem, the greatest difference between the groups was in the minimum peak acceleration in the lateral axis (p < 0.01). Conclusions The accelerometer fitted in the iPhone 4® can be used to study and analyse the kinematics of the Romberg test in frail and non-frail elderly people. In addition, the results indicate that the accelerometry values were significantly different between the frail and non-frail groups, and that the accelerometer values increased as the test was made more difficult.
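As a rough illustration of the statistical comparisons described above, the sketch below runs a Mann–Whitney U test between groups and a Wilcoxon signed-rank test within a group using SciPy; every number is an invented placeholder, not the study's data.

```python
# Illustrative sketch only: made-up peak lateral accelerations (m/s^2)
# standing in for the study's accelerometry values.
import numpy as np
from scipy import stats

frail = np.array([0.42, 0.51, 0.38, 0.47, 0.55, 0.49, 0.44, 0.53, 0.40])
non_frail = np.array([0.21, 0.18, 0.25, 0.19, 0.23, 0.20, 0.17, 0.24, 0.22])

# Between-group comparison (independent samples): Mann-Whitney U test
u_stat, p_between = stats.mannwhitneyu(frail, non_frail, alternative="two-sided")

# Within-group comparison across test variants (paired samples): Wilcoxon
# signed-rank test, e.g. eyes-open vs eyes-closed for the same subjects
eyes_open = frail
eyes_closed = frail + np.array([0.10, 0.08, 0.12, 0.09, 0.07, 0.11, 0.10, 0.06, 0.09])
w_stat, p_within = stats.wilcoxon(eyes_open, eyes_closed)

print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_between:.4f}")
print(f"Wilcoxon W = {w_stat:.1f}, p = {p_within:.4f}")
```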

Relevance:

10.00%

Publisher:

Abstract:

Large multi-site image-analysis studies have successfully discovered genetic variants that affect brain structure in tens of thousands of subjects scanned worldwide. Candidate genes have also been associated with brain integrity, measured using fractional anisotropy (FA) in diffusion tensor images (DTI). To evaluate the heritability and robustness of DTI measures as a target for genetic analysis, we compared 417 twins and siblings scanned on the same day on the same high-field (4-Tesla) scanner with two protocols: (1) 94 directions, 2 mm slice thickness; (2) 27 directions, 5 mm slice thickness. Using mean FA in white matter ROIs and FA skeletons derived using FSL, we (1) examined differences in voxelwise means, variances, and correlations among the measures; and (2) assessed heritability with structural equation models, using the classical twin design. FA measures from the genu of the corpus callosum were highly heritable, regardless of protocol. Genome-wide analysis of the genu mean FA revealed differences across protocols in the top associations.
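The study fitted full structural equation (ACE) models; as a much simpler illustration of the same classical twin-design logic, the sketch below applies Falconer's formula to made-up MZ/DZ correlations.

```python
# Back-of-envelope heritability from twin correlations via Falconer's
# formula, h^2 = 2 * (r_MZ - r_DZ). The correlations are invented; the
# paper itself used structural equation (ACE) models, not this shortcut.
r_mz = 0.72  # illustrative FA correlation in monozygotic pairs
r_dz = 0.41  # illustrative FA correlation in dizygotic pairs

h2 = 2 * (r_mz - r_dz)  # additive genetic component (A)
c2 = r_mz - h2          # shared environment (C), equal to 2*r_dz - r_mz
e2 = 1 - r_mz           # unique environment (E)
print(f"h2 = {h2:.2f}, c2 = {c2:.2f}, e2 = {e2:.2f}")
```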

Relevance:

10.00%

Publisher:

Abstract:

Origin-destination matrix (ODM) estimation can benefit from the availability of sample trajectories, which recent technologies have made measurable. This paper focuses on the case of transport networks where traffic counts are measured by magnetic loops and sample trajectories are available. An example of such a network is the city of Brisbane, where Bluetooth detectors are now operating. This additional data source is used to extend classical ODM estimation to a link-specific ODM (LODM), using a convex optimisation formulation that also incorporates network constraints. The proposed algorithm is assessed on a simulated network.
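The abstract does not spell out the formulation, but a minimal convex program for link-count-based ODM estimation, with one toy linear constraint standing in for trajectory information, might look like the following sketch (cvxpy; the assignment matrix and counts are invented, not Brisbane data).

```python
# Toy least-squares OD estimation from link counts as a convex program.
# The paper's LODM formulation with Bluetooth trajectories is richer.
import numpy as np
import cvxpy as cp

# A[i, j] = fraction of OD pair j's flow using link i (toy assignment map)
A = np.array([[1.0, 0.0, 0.5],
              [0.0, 1.0, 0.5],
              [1.0, 1.0, 0.0]])
y = np.array([120.0, 150.0, 200.0])  # observed loop-detector link counts

x = cp.Variable(3, nonneg=True)      # demand for 3 OD pairs
# Sample trajectories could enter as extra linear constraints, e.g. a
# lower bound on one OD pair's share of a corridor (illustrative only):
constraints = [x[0] >= 0.4 * (x[0] + x[1])]
prob = cp.Problem(cp.Minimize(cp.sum_squares(A @ x - y)), constraints)
prob.solve()
print("Estimated OD flows:", np.round(x.value, 1))
```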

Relevance:

10.00%

Publisher:

Abstract:

A central dimension of the State’s responsibility in a liberal democracy and any just society is the protection of individuals’ central rights and freedoms, and the creation of the minimum conditions under which each individual has an opportunity to lead a life of sufficient equality, dignity and value. A special subset of this responsibility is to protect those who are unable to protect themselves from genuine harm. Substantial numbers of children suffer serious physical, emotional and sexual abuse, and neglect, at the hands of their parents and caregivers or other known parties. Child abuse and neglect occurs in a situation of extreme power asymmetry. The physical, social, behavioural and economic costs to the individual, and the social and economic costs to communities, are vast. Children are not generally able to protect themselves from serious abuse and neglect. This enlivens both the State’s responsibility to protect the child, and the debate about how that responsibility can and should be discharged. A core question arises for all societies, given that most serious child maltreatment occurs in the family sphere, is unlikely to be disclosed, causes substantial harm to both individual and community, and infringes fundamental individual rights and freedoms. The question is: how can society identify these situations so that the maltreatment can be interrupted, the child’s needs for security and safety, health care and other rehabilitation can be met, and the family’s needs can be addressed to reduce the likelihood of recurrence? This chapter proposes a theoretical framework applicable to any society that is considering justifiable and effective policy approaches to identify and respond to cases of serious child abuse and neglect. The core of the theoretical framework is based on major principles from classical liberal political philosophy (Locke and Mill) and from leading political philosophers of the twentieth century and the first part of the new millennium (Rawls, Rorty, Okin, Nussbaum), and is further situated within fundamental frameworks of civil and criminal law, health and economics.

Relevance:

10.00%

Publisher:

Abstract:

Deterrence-based initiatives form a cornerstone of many road safety countermeasures. This approach is informed by Classical Deterrence Theory, which proposes that individuals will be deterred from committing offences if they fear the perceived consequences of the act, especially the perceived certainty, severity and swiftness of sanctions. While deterrence-based countermeasures have proven effective in reducing a range of illegal driving behaviours known to cause crashes, such as speeding and drink driving, the exact level of exposure required, and how the deterrence process works, remain unknown. As a result, the current study involved a systematic review of the literature to identify theoretical advancements within deterrence theory that have informed evidence-based practice. Studies that reported on perceptual deterrence between 1950 and June 2015 were searched in electronic databases including PsycINFO and ScienceDirect, both within road safety and non-road safety fields. This review indicated that scientific efforts to understand deterrence processes for road safety were most intense during the 1970s and 1980s. This era produced competing theories that postulated that both legal and non-legal factors can influence offending behaviours. Since this time, little theoretical progression has been made in the road safety arena, apart from Stafford and Warr's (1993) reconceptualisation of deterrence, which illuminated the important issue of punishment avoidance. In contrast, the broader field of criminology has continued to advance theoretical knowledge by investigating a range of individual difference-based factors proposed to influence deterrent processes, including moral inhibition, social bonding, self-control, and tendencies to discount the future. However, this scientific knowledge has not been directed towards identifying how to best utilise deterrence mechanisms to improve road safety. This paper will highlight the implications of this lack of progression and provide direction for future research.

Relevance:

10.00%

Publisher:

Abstract:

Driving on an approach to a signalized intersection while distracted is relatively risky, as potential vehicular conflicts and the resulting angle collisions tend to be more severe than at other locations. Given the prevalence and importance of this particular scenario, the objective of this study was to examine the decisions and actions of distracted drivers at the onset of yellow lights. Driving simulator data were obtained from a sample of 69 drivers under baseline and handheld cell phone conditions at the University of Iowa – National Advanced Driving Simulator. Explanatory variables included age, gender, cell phone use, distance to stop-line, and speed. Although there is extensive research on drivers’ responses to yellow traffic signals, the examinations have been conducted with traditional regression-based approaches, which do not necessarily reveal the underlying relations and patterns in the sampled data. In this paper, we exploit the benefits of both classical statistical inference and data mining techniques to identify relationships among main effects, non-linearities, and interaction effects. Results suggest that the probability of yellow light running increases with driving speed at the onset of yellow. Both young (18–25 years) and middle-aged (30–45 years) drivers reveal a reduced propensity for yellow light running whilst distracted across the entire speed range, exhibiting possible risk compensation during this critical driving situation. The propensity for yellow light running is significantly higher for both male and female older (50–60 years) drivers when distracted. Driver experience, captured by age, interacts with distraction: older drivers’ slower physiological responses make driving while distracted particularly risky.
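To illustrate the pairing of classical inference with data mining described above, the sketch below fits a logistic regression with a distraction-by-age interaction and a shallow decision tree to synthetic data; the variable names echo the study's explanatory variables, but the data-generating rule is fabricated.

```python
# Synthetic illustration: classical inference (logit with interactions)
# alongside a data-mining model (shallow tree) for yellow-light running.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "speed": rng.uniform(40, 110, n),            # km/h at yellow onset
    "distracted": rng.integers(0, 2, n),         # handheld phone condition
    "age_group": rng.choice(["young", "middle", "older"], n),
})
# Fabricated outcome rule: running propensity rises with speed, and
# distraction interacts with age (older + distracted is riskiest)
logit = -6 + 0.07 * df.speed + np.where(
    (df.age_group == "older") & (df.distracted == 1), 1.5, -0.5 * df.distracted)
df["ran_yellow"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Classical inference: main effects plus a distraction-by-age interaction
model = smf.logit("ran_yellow ~ speed + distracted * C(age_group)", df).fit(disp=0)
print(model.summary())

# Data mining: a shallow tree surfaces non-linear splits and interactions
X = pd.get_dummies(df[["speed", "distracted", "age_group"]])
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, df.ran_yellow)
print(f"Tree accuracy: {tree.score(X, df.ran_yellow):.2f}")
```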

Relevance:

10.00%

Publisher:

Abstract:

Speculative property developers, criticised for building dog boxes and the slums of tomorrow, are generally hated by urban planners and the public alike. But the doors of state governments are seemingly always open to developers and their lobbyists. Politicians find it hard to say no to the development industry's demands for concessions, both because of the contribution housing construction makes to the economic bottom line and because there is a need for well-located housing. New supply is also seen as a solution to declining housing affordability. Classical economic theory, however, is too simplistic for housing supply. Instead, an offshoot of Game Theory known as Market Design not only offers greater insight into apartment supply but can also simultaneously address price, design and quality issues. New research reveals that the most significant risk in residential development is settlement risk: buyers failing to proceed with their purchase despite there being a pre-sale contract. At the point of settlement, the developer has expended all the project funds, only to see forecast revenue evaporate. While new buyers may be found, this process is likely to strip the profitability out of the project. As the global financial crisis exposed, buyers are inclined to walk if property values slide. This settlement problem reflects a poor legal mechanism (the pre-sale contract) and a lack of incentive for truthfulness. A second problem is the search cost of finding buyers: at around 10% of project costs, pre-sales are more expensive to developers than finance. This is where Market Design comes in.

Relevance:

10.00%

Publisher:

Abstract:

We have designed, synthesized and utilized a new non-fullerene electron acceptor, 9,9′-(9,9-dioctyl-9H-fluorene-2,7-diyl)bis(2,7-dioctyl-4-(octylamino)benzo[lmn][3,8]phenanthroline-1,3,6,8(2H,7H)-tetraone) (B2), for use in solution-processable bulk-heterojunction devices. B2 is based on a central fluorene moiety, which was capped at both ends with an electron-accepting naphthalenediimide functionality. B2 exhibited excellent solubility (>30 mg mL⁻¹ in chloroform), high thermal and photochemical stability, and appropriate energy levels for use with the classical polymer donor regioregular poly(3-hexylthiophene). A power conversion efficiency of 1.16% was achieved for primitive bulk-heterojunction devices, with a high fill factor of approximately 54%.

Relevance:

10.00%

Publisher:

Abstract:

In this study, 1,833 systemic sclerosis (SSc) cases and 3,466 controls were genotyped with the Immunochip array. Classical alleles, amino acid residues, and SNPs across the human leukocyte antigen (HLA) region were imputed and tested. These analyses resulted in a model composed of six polymorphic amino acid positions and seven SNPs that explained the observed significant associations in the region. In addition, a replication step comprising 4,017 SSc cases and 5,935 controls was carried out for several selected non-HLA variants, reaching a total of 5,850 cases and 9,401 controls of European ancestry. Following this strategy, we identified and validated three SSc risk loci, including DNASE1L3 at 3p14, the SCHIP1-IL12A locus at 3q25, and ATG5 at 6q21, as well as a suggested association of the TREH-DDX6 locus at 11q23. The associations of several previously reported SSc risk loci were validated and further refined, and the observed peak of association in PXK was related to DNASE1L3. Our study has increased the number of known genetic associations with SSc, provided further insight into the pleiotropic effects of shared autoimmune risk factors, and highlighted the power of dense mapping for detecting previously overlooked susceptibility loci.
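As a generic illustration of the single-variant case-control tests that underlie analyses like this one (the study additionally used imputation and conditional modelling across the region), here is a minimal allelic chi-square sketch with invented allele counts.

```python
# Single-variant case-control allelic association test; all counts are
# invented for illustration and are not from the Immunochip study.
import numpy as np
from scipy.stats import chi2_contingency

# 2x2 allele-count table: rows = cases/controls, cols = risk/other allele
table = np.array([[620, 3046],    # 1,833 cases  -> 3,666 alleles
                  [880, 6052]])   # 3,466 controls -> 6,932 alleles
chi2, p, dof, _ = chi2_contingency(table)

# Allelic odds ratio from the cross-product
odds_ratio = (table[0, 0] * table[1, 1]) / (table[0, 1] * table[1, 0])
print(f"chi2 = {chi2:.1f}, p = {p:.2e}, OR = {odds_ratio:.2f}")
```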

Relevance:

10.00%

Publisher:

Abstract:

Our understanding of the origin and fate of the IgE-switched B cell has been markedly improved by studies in mouse models. The immediate precursor of the IgE-switched B cell is either a relatively naive nonswitched B cell or a mature IgG-switched B cell; these two routes are referred to as the direct and indirect pathways, respectively. IgE responses derived from each pathway differ significantly, largely reflecting the difference in time spent in a germinal center and thus the time available for clonal expansion, somatic hypermutation, affinity maturation, and acquisition of a memory phenotype. The clinical and therapeutic implications for IgE responses in human subjects are still a matter of debate, largely because the immunization procedures used in the animal models differ significantly from classical atopic sensitization to allergens from pollen and mites. On the basis of the limited information available, it seems likely that these atopic IgE responses are characterized by a relatively low IgG/IgE ratio, low B-cell memory, and modest affinity maturation, which fits well with the direct switching pathway. It remains unresolved how the IgE response evolves to cover a wide epitope repertoire involving many epitopes per allergen, as well as many different allergens from a single allergen source.

Relevance:

10.00%

Publisher:

Abstract:

In this paper, we image the highly confined, long-range plasmons of a nanoscale metal stripe waveguide using quantum emitters. Plasmons were excited using a highly focused 633 nm laser beam and a specially designed grating structure to provide stronger incoupling to the desired mode. A homogeneous thin layer of quantum dots (QDs) was used to image the near-field intensity of the propagating plasmons on the waveguide. We observed that the photoluminescence is quenched when the QD-to-metal-surface distance is less than 10 nm. The optimal spacer layer thickness for the stripe waveguides was found to be around 20 nm. The authors believe that the findings of this paper will prove beneficial for the development of plasmonic devices utilising stripe waveguides.

Relevance:

10.00%

Publisher:

Abstract:

Several statistical procedures already available in the literature are employed in developing the water quality index (WQI). The complexity and interdependency of the physical and chemical processes in water could be more easily explained if statistical approaches were applied to water quality indexing. The most popular statistical method used in developing a WQI is principal component analysis (PCA). In the literature, WQI development based on classical PCA has mostly used water quality data that have been transformed and normalized, with outliers either retained in or eliminated from the analysis. However, the classical mean and sample covariance matrix used in classical PCA are not reliable when outliers exist in the data. Since the presence of outliers may affect the computation of the principal components, robust principal component analysis (RPCA) should be used. Focusing on the Langat River, the RPCA-WQI was introduced for the first time in this study to re-calculate the DOE-WQI. Results show that the RPCA-WQI is capable of capturing a distribution similar to that of the existing DOE-WQI.
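A minimal sketch of the classical-versus-robust contrast follows, using scikit-learn's MinCovDet as one possible robust covariance estimator (the abstract does not specify the study's exact RPCA method); the data are synthetic stand-ins for standardized water quality parameters.

```python
# Classical PCA vs a robust variant built from a robust covariance
# estimate, showing how outliers distort the classical components.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.covariance import MinCovDet

rng = np.random.default_rng(1)
cov = [[1.0, 0.8, 0.6], [0.8, 1.0, 0.7], [0.6, 0.7, 1.0]]
X = rng.multivariate_normal([0, 0, 0], cov, 200)
X[:5] += 8  # a handful of gross outliers (e.g. sensor faults)

# Classical PCA: eigenvectors of the ordinary sample covariance matrix
pca = PCA(n_components=1).fit(X)
print("classical PC1:", np.round(pca.components_[0], 2))

# Robust alternative: eigenvectors of the MinCovDet covariance estimate
robust_cov = MinCovDet(random_state=0).fit(X).covariance_
eigvals, eigvecs = np.linalg.eigh(robust_cov)
print("robust PC1:   ", np.round(eigvecs[:, -1], 2))  # largest eigenvalue last
```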

Relevance:

10.00%

Publisher:

Abstract:

In [8] the authors developed a logical system based on the definition of a new non-classical connective ⊗ that captures the notion of reparative obligation. The system proved appropriate for handling well-known contrary-to-duty paradoxes, but no model-theoretic semantics was presented. In this paper we fill the gap and define a suitable possible-world semantics for the system, for which we can prove soundness and completeness. The semantics is a preference-based, non-normal one that extends and generalizes semantics for classical modal logics.
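For readers unfamiliar with possible-world semantics, here is a toy evaluator for the box modality in an ordinary Kripke model; note that the paper's actual semantics is preference-based and non-normal, which this generic relational sketch does not capture.

```python
# Toy Kripke-model evaluator, only to illustrate "possible-world semantics".
# The paper's semantics is preference-based and non-normal; this ordinary
# (normal, relational) sketch is a simpler cousin, not the paper's system.
from typing import Callable

access = {"w1": {"w2", "w3"}, "w2": {"w3"}, "w3": set()}  # accessibility R
val = {"p": {"w2", "w3"}, "q": {"w3"}}                    # atoms -> worlds where true

def atom(p: str) -> Callable[[str], bool]:
    """Return the evaluation function of an atomic proposition."""
    return lambda w: w in val[p]

def box(phi: Callable[[str], bool], w: str) -> bool:
    """Box phi holds at w iff phi holds at every world R-accessible from w."""
    return all(phi(v) for v in access[w])

print(box(atom("p"), "w1"))  # True: p holds at both w2 and w3
print(box(atom("q"), "w1"))  # False: q fails at w2
```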

Relevance:

10.00%

Publisher:

Abstract:

Theodor Adorno was opposed to the cinema because he felt it was too close to reality, and ipso facto an extension of ideological Capital, as he wrote in 1944 in Dialectic of Enlightenment. What troubled Adorno was the iconic nature of cinema, the semiotic category invented by C. S. Peirce in which the signifier (sign) does not merely signify, in the arbitrary capacity attested by Saussure, but mimics the formal-visual qualities of its referent. Iconicity finds its perfect example in the film’s ingenuous surface illusion of an unmediated reality. Its genealogy (the iconic) lay, since classical antiquity, in the Greek term eikōn, meaning “image,” which referred to the ancient portrait statues of victorious athletes that were thought to bear a direct similitude with their parent divinities. For the postwar, Hollywood-film spectator, Adorno said, “the world outside is an extension of the film he has just left,” because realism is a precise instrument for the manipulation of the mass spectator by the culture industry, for which the filmic image is an advertisement for the world unedited. Mimesis, or the reproduction of reality, is a “mere reproduction of the economic base.” It is precisely film’s iconicity, then, its “realist aesthetic . . . [that] makes it inseparable from its commodity character.”...

Relevance:

10.00%

Publisher:

Abstract:

Ankylosing spondylitis (AS) is a common, highly heritable inflammatory arthritis for which HLA-B*27 is the major genetic risk factor, although its role in the aetiology of AS remains elusive. To better understand the genetic basis of the MHC susceptibility loci, we genotyped 7,264 MHC SNPs in 22,647 AS cases and controls of European descent. We imputed SNPs, classical HLA alleles and amino-acid residues within HLA proteins, and tested these for association with AS status. Here we show that, in addition to effects due to HLA-B*27 alleles, several other HLA-B alleles also affect susceptibility. After controlling for the associated haplotypes in HLA-B, we observe independent associations with variants in the HLA-A, HLA-DPB1 and HLA-DRB1 loci. We also demonstrate that the ERAP1 SNP rs30187 association is not restricted to carriers of HLA-B*27 but is also found in HLA-B*40:01 carriers, independent of HLA-B*27 genotype.