948 results for Relative and point positioning
Abstract:
Background. The positive health and wellbeing effects of social support have been consistently demonstrated in the literature since the late 1970s. However, a better understanding of the effects of age and sex is required. Method. We examined the factor structure and reliability of Kessler's Perceived Social Support (KPSS) measure in a community-based sample that comprised younger and older adult cohorts from the Australian Twin Registry (ATR), totalling 11,389 males and females aged 18-95, of whom 887 were retested 25 months later. Results. Factor analysis consistently identified seven factors: support from spouse, twin, children, parents, relatives, friends and helping support. Internal reliability for the seven dimensions ranged from 0.87 to 0.71 and test-retest reliability ranged from 0.75 to 0.48. Perceived support was only marginally higher in females. Age dependencies were explored. Across the age range, there was a slight decline (more marked in females) in the perceived support from spouse, parent and friend, a slight increase in perceived relative and helping support for males but none for females, a substantial increase in the perceived support from children for males and females and a negligible decline in total KPSS for females against a negligible increase for males. The perceived support from twin remained constant. Females were more likely to have a confidant, although this declined with age whilst increasing with age for males. Conclusions. Total scores for perceived social support conflate heterogeneous patterns on sub-scales that differ markedly by age and sex. Our paper describes these relationships in detail in a very large Australian sample.
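The sub-scale reliabilities quoted above (internal consistency 0.71-0.87; test-retest 0.48-0.75) are standard psychometric quantities. As a minimal sketch, assuming a generic respondents-by-items matrix rather than the actual ATR/KPSS data, the Python below shows how Cronbach's alpha and a Pearson test-retest correlation are typically computed.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Internal consistency of one sub-scale.
    `items`: rows = respondents, columns = items of the sub-scale."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_variances / total_variance)

def test_retest(scores_t1: np.ndarray, scores_t2: np.ndarray) -> float:
    """Test-retest reliability as the Pearson correlation between two occasions."""
    return float(np.corrcoef(scores_t1, scores_t2)[0, 1])

# Illustrative use with random stand-in data (not the KPSS data):
rng = np.random.default_rng(0)
demo = rng.integers(1, 5, size=(100, 4)).astype(float)
print(cronbach_alpha(demo))
print(test_retest(demo.sum(axis=1), demo.sum(axis=1) + rng.normal(size=100)))
```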
Abstract:
Objective: Secondary analyses of a previously conducted 1-year randomized controlled trial were performed to assess the application of responder criteria in patients with knee osteoarthritis (OA), using different sets of responder criteria developed by the Osteoarthritis Research Society International (OARSI) for intra-articular drugs (Propositions A and B) and by Outcome Measures in Arthritis Clinical Trials (OMERACT)-OARSI (Proposition D). Methods: Two hundred fifty-five patients with knee OA were randomized to appropriate care with hylan G-F 20 (AC + H) or appropriate care without hylan G-F 20 (AC). A patient was defined as a responder at month 12 based on change in Western Ontario and McMaster Universities Osteoarthritis Index pain and function (0-100 normalized scale) and patient global assessment of OA in the study knee (at least a one-category improvement on a very poor, poor, fair, good, very good scale). All propositions incorporate both minimum relative and minimum absolute changes. Results: Statistically significant differences in responders between treatment groups, in favor of hylan G-F 20, were detected for Proposition A (AC + H = 53.5%, AC = 25.2%), Proposition B (AC + H = 56.7%, AC = 32.3%) and Proposition D (AC + H = 66.9%, AC = 42.5%). The highest effectiveness in both treatment groups was observed with Proposition D, whereas Proposition A resulted in the lowest. The treatment group differences always exceeded the 20% minimum clinically important difference between groups established a priori, and were 28.3%, 24.4% and 24.4% for Propositions A, B and D, respectively. Conclusion: This analysis provides evidence for the capacity of the OARSI and OMERACT-OARSI responder criteria to detect clinically important, statistically detectable differences between treatment groups. (C) 2004 OsteoArthritis Research Society International. Published by Elsevier Ltd. All rights reserved.
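All three propositions combine minimum relative and minimum absolute changes, as noted above. A minimal sketch of such a rule is shown below; the 50%/20-point and 20%/10-point thresholds are the commonly cited OMERACT-OARSI values and the one-category handling of the patient global assessment follows the description above, but both are assumptions here rather than figures copied from the trial report.

```python
def omeract_oarsi_style_responder(pain_base: float, pain_final: float,
                                  func_base: float, func_final: float,
                                  global_improved_one_category: bool) -> bool:
    """Responder rule on 0-100 normalized pain/function scales (higher = worse)."""
    def improvement(base: float, final: float):
        absolute = base - final                          # positive = improvement
        relative = 100.0 * absolute / base if base else 0.0
        return absolute, relative

    pain_abs, pain_rel = improvement(pain_base, pain_final)
    func_abs, func_rel = improvement(func_base, func_final)

    # High improvement in pain or in function qualifies on its own.
    if (pain_rel >= 50 and pain_abs >= 20) or (func_rel >= 50 and func_abs >= 20):
        return True

    # Otherwise, moderate improvement in at least two of the three domains.
    criteria = [
        pain_rel >= 20 and pain_abs >= 10,
        func_rel >= 20 and func_abs >= 10,
        global_improved_one_category,
    ]
    return sum(criteria) >= 2

# Example: pain 60 -> 35, function 55 -> 45, global improved by one category.
print(omeract_oarsi_style_responder(60, 35, 55, 45, True))   # True
```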
Abstract:
Objective: To evaluate the protective eyewear promotion (PEP) project, a comprehensive educational strategy to increase the use of appropriate protective eyewear by squash players. Methods: An ecological study design was used. Four squash venues in one playing association were randomly chosen to receive PEP and four in another association maintained usual practice and hence formed a control group. The primary evaluation measurements were surveys of cross-sectional samples of players carried out before and after the intervention. The surveys investigated players' knowledge, behaviours, and attitudes associated with the use of protective eyewear. The survey carried out after the intervention also determined players' exposure to PEP. Univariate and multivariate analyses were undertaken to describe differences at PEP venues from pre- to post-intervention and to compare these with the control venues. Results: The PEP players had 2.4 times the odds (95% confidence interval, 1.3 to 4.2) of wearing appropriate eyewear compared with control group players post-intervention, relative to the groups' pre-intervention baselines. Components of PEP, such as stickers and posters and the availability and prominent positioning of the project eyewear, were found to contribute to players adopting favourable eyewear behaviours. Conclusions: Components of the PEP intervention were shown to be effective. The true measure of success will be the sustainability and dissemination of the project, of favourable eyewear behaviours, and of evidence of the prevention of eye injuries long into the future.
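The effect estimate above (2.4 times the odds, 95% CI 1.3 to 4.2) comes from multivariate modelling relative to the pre-intervention baselines; as a simpler illustration of the underlying quantity, the sketch below computes a crude odds ratio and Wald 95% confidence interval from a 2x2 table. The counts are made up for illustration and are not the study's data.

```python
import math

def odds_ratio_wald_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Crude odds ratio and Wald CI for a 2x2 table:

                       wore eyewear   did not
        PEP venues          a            b
        control venues      c            d
    """
    odds_ratio = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(odds_ratio) - z * se_log_or)
    upper = math.exp(math.log(odds_ratio) + z * se_log_or)
    return odds_ratio, (lower, upper)

# Hypothetical counts:
print(odds_ratio_wald_ci(40, 60, 20, 80))
```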
Abstract:
Background: This paper describes SeqDoC, a simple, web-based tool to carry out direct comparison of ABI sequence chromatograms. This allows the rapid identification of single nucleotide polymorphisms (SNPs) and point mutations without the need to install or learn more complicated analysis software. Results: SeqDoC produces a subtracted trace showing differences between a reference and test chromatogram, and is optimised to emphasise those characteristic of single base changes. It automatically aligns sequences, and produces straightforward graphical output. The use of direct comparison of the sequence chromatograms means that artefacts introduced by automatic base-calling software are avoided. Homozygous and heterozygous substitutions and insertion/deletion events are all readily identified. SeqDoC successfully highlights nucleotide changes missed by the Staden package 'tracediff' program. Conclusion: SeqDoC is ideal for small-scale SNP identification, for identification of changes in random mutagenesis screens, and for verification of PCR amplification fidelity. Differences are highlighted, not interpreted, allowing the investigator to make the ultimate decision on the nature of the change.
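As a rough illustration of the subtracted-trace idea (not SeqDoC's actual algorithm, which also performs its own alignment and normalisation), the sketch below differences two pre-aligned four-channel chromatogram arrays so that single-base changes stand out as isolated peaks.

```python
import numpy as np

def subtracted_trace(reference: np.ndarray, test: np.ndarray) -> np.ndarray:
    """Difference profile between two chromatograms.

    Each input is an (n_points, 4) array of A/C/G/T trace intensities, assumed
    to be already aligned and scaled to a common intensity range."""
    n = min(len(reference), len(test))
    per_channel_diff = np.abs(reference[:n] - test[:n])
    # Summing over the four channels turns a single-base change into one clear peak.
    return per_channel_diff.sum(axis=1)

# Toy example: identical traces except around positions 48-52.
rng = np.random.default_rng(0)
ref = rng.random((200, 4))
alt = ref.copy()
alt[48:53, 1] += 0.8          # simulate a C-channel difference at one base
print(subtracted_trace(ref, alt).argmax())   # -> 48, the first point of the change
```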
Abstract:
The mechanical behavior of the vertebrate skull is often modeled using free-body analysis of simple geometric structures and, more recently, finite-element (FE) analysis. In this study, we compare experimentally collected in vivo bone strain orientations and magnitudes from the cranium of the American alligator with those extrapolated from a beam model and extracted from an FE model. The strain magnitudes predicted from beam and FE skull models bear little similarity to relative and absolute strain magnitudes recorded during in vivo biting experiments. However, quantitative differences between principal strain orientations extracted from the FE skull model and recorded during the in vivo experiments were smaller, and both generally matched expectations from the beam model. The differences in strain magnitude between the data sets may be attributable to the level of resolution of the models, the material properties used in the FE model, and the loading conditions (i.e., external forces and constraints). This study indicates that FE models and modeling of skulls as simple engineering structures may give a preliminary idea of how these structures are loaded, but whenever possible, modeling results should be verified with either in vitro or preferably in vivo testing, especially if precise knowledge of strain magnitudes is desired. (c) 2005 Wiley-Liss, Inc.
Abstract:
There is a widening gulf in change literature between theoretical notions of evolving organisational form and the emerging reality that old and new organisational structures coexist. This paper explores this dichotomy in Enterprise Resource Planning change. It develops a cellular hierarchy framework to explain how different types of hierarchy coexist within the same organisation during the implementation of Enterprise Resource Planning. © 2006 The Author; Journal compilation © 2006 Blackwell Publishing Ltd.
Abstract:
This study set out to analyse the extent to which the use of different power bases by hierarchical superiors predicts workers' levels of work engagement and resilience, thereby broadening knowledge about the behaviour of municipal civil servants with regard to the levels of work engagement and resilience they display. The study started from French and Raven's (1959) definition of power: power is the potential influence that agent O could exert over subject P. It adopted Schaufeli and Bakker's (2003) concept of work engagement, defined as a positive motivational construct characterised by vigour, dedication and absorption, always related to work, which implies a sense of accomplishment, involves a positive cognitive state, persists over time and thus has a motivational and social nature. Finally, it used Grotberg's (2005) concept of resilience as the human capacity to face, overcome and be strengthened or transformed by experiences of adversity. The general objective was therefore to test the predictive capacity of supervisors' social power bases with respect to resilience and work engagement among municipal civil servants in Diadema, SP (Brazil). Ninety-five municipal civil servants from Diadema, SP, took part in the study; a slight majority were female (51.6%), and the largest age group was 25 to 40 years (38.9%). Most participants (60%) reported holding an undergraduate degree (35.8%) or a postgraduate qualification (24.2%). The instruments used were the Supervisor Power Bases Scale (EBPS), developed by Martins and Guimarães (2007); the Resilience Assessment Scale (EAR), constructed by Martins, Siqueira and Emilio (2011); and the Utrecht Work Engagement Scale (UWES), whose validity and reliability indicators were examined in this study. The results partially confirmed an association between work engagement and resilience, since work engagement correlated with three of the five resilience factors: positive adaptation to change, personal competence and persistence in the face of difficulty. The dimensions making up the resilience variable had means around point four of the resilience scale ('often true'), indicating that participants frequently perceive themselves as able to face life's adversities owing to their high perceived persistence, able to adapt to change, and possessing good levels of personal competence and spirituality. The means of the work engagement dimensions were very close to point four of the engagement scale ('a few times a week'), indicating that participants perceive a high degree of work engagement in themselves, that is, they have vigour, are dedicated and become absorbed in their work. Workers also perceived expert power as the base most used by their hierarchical superiors, with a mean of 4.46 (SD = 0.71). Finally, the results indicated that the role and positioning of management had no significant impact on any of the outcome variables; hence, power bases neither explained resilience nor predicted work engagement for the workers who took part in this research.
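The predictive test described above amounts to regressing the engagement and resilience scores on the perceived power-base scores. A minimal ordinary-least-squares sketch is shown below; the array shapes and variable names are hypothetical stand-ins for the EBPS and UWES scores, not the study's data.

```python
import numpy as np

def ols_coefficients(predictors: np.ndarray, outcome: np.ndarray) -> np.ndarray:
    """Ordinary least squares: returns the intercept followed by one slope per predictor."""
    design = np.column_stack([np.ones(len(predictors)), predictors])
    beta, *_ = np.linalg.lstsq(design, outcome, rcond=None)
    return beta

# Hypothetical data: 95 respondents, 5 power-base scores, one engagement score.
rng = np.random.default_rng(1)
power_bases = rng.uniform(1, 5, size=(95, 5))     # stand-in for EBPS scores
engagement = rng.uniform(0, 6, size=95)           # stand-in for UWES total score
print(ols_coefficients(power_bases, engagement))
```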
Abstract:
The use of quantitative methods has become increasingly important in the study of neurodegenerative disease. Disorders such as Alzheimer's disease (AD) are characterized by the formation of discrete, microscopic, pathological lesions which play an important role in pathological diagnosis. This article reviews the advantages and limitations of the different methods of quantifying the abundance of pathological lesions in histological sections, including estimates of density, frequency, coverage, and the use of semiquantitative scores. The major sampling methods by which these quantitative measures can be obtained from histological sections, including plot or quadrat sampling, transect sampling, and point-quarter sampling, are also described. In addition, the data analysis methods commonly used to analyse quantitative data in neuropathology, including analyses of variance (ANOVA) and principal components analysis (PCA), are discussed. These methods are illustrated with reference to particular problems in the pathological diagnosis of AD and dementia with Lewy bodies (DLB).
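To make the density estimate concrete, the sketch below converts plot (quadrat) counts of lesions into a density per square millimetre; the field size and counts are illustrative assumptions, not data from the studies reviewed.

```python
import numpy as np

def lesion_density(counts_per_field, field_width_um: float, field_height_um: float) -> float:
    """Mean lesion density (lesions per mm^2) from plot/quadrat counts.

    `counts_per_field`: number of lesions counted in each sample field."""
    field_area_mm2 = (field_width_um / 1000.0) * (field_height_um / 1000.0)
    return float(np.mean(counts_per_field)) / field_area_mm2

# Example: plaque counts in ten 200 x 200 micrometre fields (made-up numbers).
print(lesion_density([3, 5, 2, 4, 6, 1, 3, 4, 5, 2], 200, 200))   # 87.5 per mm^2
```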
Abstract:
Although firms are faced with a large number of market introduction failures, research into a major driver of these failures, customer resistance to innovation, is surprisingly scarce. While most authors have investigated positive adoption decisions, this paper focuses instead on consumer resistance to innovation. The current study presents a conceptual framework which explicates the major components of consumer resistance: (1) rejection, (2) postponement, and (3) opposition, and discusses two main groups of antecedents to consumer resistance: (1) the degree of change required and (2) conflicts with the consumer's prior belief structure. This framework is explored with both a literature review and a qualitative focus group study. These joint efforts result in the formulation of a model of consumer resistance. Finally, the authors discuss several relevant theoretical and strategic implications and point out directions for future research.
Abstract:
Compared to packings, trays are more cost-effective column internals because they create a large interfacial area for mass transfer through the interaction of the vapour with the liquid. The tray supports a mass of froth or spray which on most trays (including the most widely used sieve trays) is not in any way controlled. The two important results of the gas/liquid interaction are the tray efficiency and the tray throughput or capacity. After many years of practical experience, both may be predicted by empirical correlations, despite the lack of fundamental understanding. It is known that the tray efficiency is in part determined by the liquid flow pattern, and the throughput by the liquid froth height, which in turn depends on the liquid hold-up and vapour velocity. This thesis describes experimental work on sieve trays in an air-water simulator, 2.44 m in diameter. The liquid flow pattern, for flow rates similar to those used in commercial-scale distillation, was observed experimentally by direct observation; by water-cooling, to simulate mass transfer; by use of potassium permanganate dye to reveal areas of longer residence time; and by height of clear liquid measurements across the tray and in the downcomer using manometers. This work presents experiments designed to evaluate flow control devices proposed to improve the gas/liquid interaction and hence improve the tray efficiency and throughput. These are (a) the use of intermediate weirs to redirect liquid to the sides of the tray so as to remove slow-moving/stagnant liquid, and (b) the use of vapour-directing slots designed to use the vapour to direct liquid towards the outlet weir, thus reducing the liquid hold-up at a given rate, i.e. increasing throughput. This method also has the advantage of removing slow-moving/stagnant liquid. In the experiments using intermediate weirs, which were placed in the centre of the tray, it was found that in general the effect of an intermediate weir depends on the depth of liquid downstream of the weir. If the weir is deeper than the downstream depth, it will cause the upstream liquid to be deeper than the downstream liquid. If the weir is not as deep as the downstream depth, it may have little or no effect on the upstream depth. An intermediate weir placed at an angle to the direction of liquid flow diverts liquid towards the sides of the tray without causing an increase in liquid hold-up/froth height. The maximum proportion of liquid caused to flow sideways by the weir is between 5% and 10%. Experimental work using vapour-directing slots on a rectangular sieve tray has shown that the horizontal momentum imparted to the liquid depends on the size of the slot. If too much momentum is transferred to the liquid, hydraulic jumps occur at the mouth of the slot, coupled with liquid entrainment. The use of slots also helps to eliminate the hydraulic gradient across sieve trays and provides a more uniform froth height on the tray. A comparison of the measured tray and point efficiencies shows that a slotted tray reduces both values by approximately 10%. This reduction is due to the fact that with a slotted tray the liquid has a reduced residence time on the tray, coupled with the fact that large bubbles pass through the slots. The effectiveness of using vapour-directing slots on a full circular tray was investigated by using dye to completely colour the biphase. The removal of the dye by clear liquid entering the tray was monitored using an overhead camera. The results show that the slots are successful in their aim of reducing slow-moving liquid at the sides of the tray; the net effect is an increase in tray efficiency. Measurements of the slot vapour velocity found it to be approximately equal to the hole velocity.
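For reference, the tray and point efficiencies discussed above are conventionally the Murphree efficiencies of standard distillation practice; the notation below is the usual one and is assumed here rather than quoted from the thesis:

\[
E_{MV} = \frac{\bar{y}_n - \bar{y}_{n+1}}{y_n^{*} - \bar{y}_{n+1}}, \qquad
E_{OG} = \frac{y_n - y_{n+1}}{y_n^{*} - y_{n+1}},
\]

where \( \bar{y}_n \) is the mean vapour composition leaving tray \( n \), \( \bar{y}_{n+1} \) the composition entering it, and \( y_n^{*} \) the composition in equilibrium with the liquid leaving the tray (for \( E_{MV} \)) or with the local liquid at a point on the tray (for the point efficiency \( E_{OG} \)).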
Abstract:
This thesis consisted of two major parts, one determining the masking characteristics of pixel noise and the other investigating the properties of the detection filter employed by the visual system. The theoretical cut-off frequency of white pixel noise can be defined from the size of the noise pixel. The empirical cut-off frequency, i.e. the largest size of noise pixels that mimics the effect of white noise in detection, was determined by measuring contrast energy thresholds for grating stimuli in the presence of spatial noise consisting of noise pixels of various sizes and shapes. The critical, i.e. minimum, number of noise pixels per grating cycle needed to mimic the effect of white noise in detection was found to decrease with the bandwidth of the stimulus. The shape of the noise pixels did not have any effect on the whiteness of pixel noise as long as there was at least the minimum number of noise pixels in all spatial dimensions. Furthermore, the masking power of white pixel noise is best described when the spectral density is calculated by taking into account all the dimensions of the noise pixels, i.e. width, height, and duration, even when there is random luminance in only one of these dimensions. The properties of the detection mechanism employed by the visual system were studied by measuring contrast energy thresholds for complex spatial patterns as a function of area in the presence of white pixel noise. Human detection efficiency was obtained by comparing human performance with that of an ideal detector. The stimuli consisted of band-pass filtered symbols, uniform and patched gratings, and point stimuli with randomised phase spectra. In agreement with the existing literature, detection performance was found to decline with the increasing amount of detail and contour in the stimulus. A measure of image complexity was developed and successfully applied to the data. The accuracy of the detection mechanism seems to depend on the spatial structure of the stimulus and the spatial spread of contrast energy.
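Detection efficiency in this framework is conventionally the ratio of the ideal detector's contrast energy threshold to the human threshold; the sketch below expresses that relationship with generic variable names (an assumption for illustration, not code from the thesis).

```python
def detection_efficiency(ideal_threshold_energy: float, human_threshold_energy: float) -> float:
    """Efficiency = E_ideal / E_human; a value of 1.0 would match the ideal detector."""
    return ideal_threshold_energy / human_threshold_energy

# Example: the ideal detector needs ten times less contrast energy than the human.
print(detection_efficiency(1.0e-6, 1.0e-5))   # -> 0.1
```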
Abstract:
The use of quantitative methods has become increasingly important in the study of neuropathology and especially in neurodegenerative disease. Disorders such as Alzheimer's disease (AD) and the frontotemporal dementias (FTD) are characterized by the formation of discrete, microscopic, pathological lesions which play an important role in pathological diagnosis. This chapter reviews the advantages and limitations of the different methods of quantifying pathological lesions in histological sections, including estimates of density, frequency, coverage, and the use of semi-quantitative scores. The sampling strategies by which these quantitative measures can be obtained from histological sections, including plot or quadrat sampling, transect sampling, and point-quarter sampling, are described. In addition, the data analysis methods commonly used to analyse quantitative data in neuropathology, including analysis of variance (ANOVA), polynomial curve fitting, multiple regression, classification trees, and principal components analysis (PCA), are discussed. These methods are illustrated with reference to quantitative studies of a variety of neurodegenerative disorders.
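A minimal numpy-based principal components analysis of the kind that could be applied to a cases-by-regions matrix of lesion densities is sketched below; the data layout is an assumption for illustration, not drawn from the studies cited.

```python
import numpy as np

def pca(data: np.ndarray, n_components: int = 2):
    """PCA via SVD of the mean-centred data matrix.

    `data`: rows = cases, columns = quantitative measures (e.g. lesion densities).
    Returns (component scores, explained variance ratios)."""
    centred = data - data.mean(axis=0)
    u, s, vt = np.linalg.svd(centred, full_matrices=False)
    scores = centred @ vt[:n_components].T
    explained = (s ** 2) / (s ** 2).sum()
    return scores, explained[:n_components]

# Made-up example: 30 cases scored in 6 brain regions.
rng = np.random.default_rng(2)
scores, ratios = pca(rng.normal(size=(30, 6)))
print(scores.shape, ratios)
```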
Abstract:
This paper surveys the literature on scale and scope economies in the water and sewerage industry. The magnitude of scale and scope economies determines the cost-efficient configuration of any industry. In the case of a regulated sector, reliable estimates of these economies are relevant for informing reform proposals that promote vertical (un)bundling and mergers. The empirical evidence allows some general conclusions. First, there is considerable evidence for the existence of vertical scope economies between upstream water production and distribution. Second, there is only mixed evidence on the existence of (dis)economies of scope between water and sewerage activities. Third, economies of scale exist up to a certain output level, and diseconomies of scale arise if a company increases its size beyond this level. However, the optimal scale of utilities also appears to vary considerably between countries. Finally, we briefly consider the implications of our findings for water pricing and point to several directions for necessary future empirical research on the measurement of these economies and on explaining their cross-country variation.
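For readers unfamiliar with how these economies are measured, the conventional definitions from a multiproduct cost function \( C(y) \) are given below; the notation is standard in the cost-function literature and is assumed here rather than taken from the survey itself:

\[
S = \frac{1}{\sum_i \partial \ln C / \partial \ln y_i}, \qquad
SC = \frac{C(y_w, 0) + C(0, y_s) - C(y_w, y_s)}{C(y_w, y_s)},
\]

where \( S > 1 \) indicates economies of scale (and \( S < 1 \) diseconomies), and \( SC > 0 \) indicates economies of scope between water output \( y_w \) and sewerage output \( y_s \).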
Abstract:
Genomics, proteomics and metabolomics are three areas that are routinely applied throughout the drug-development process as well as after a product enters the market. This review discusses all three 'omics, reporting on the key applications, techniques, recent advances and expectations of each. Genomics, mainly through the use of novel and next-generation sequencing techniques, has advanced areas of drug discovery and development through the comparative assessment of normal and diseased-state tissues, transcription and/or expression profiling, side-effect profiling, pharmacogenomics and the identification of biomarkers. Proteomics, through techniques including isotope-coded affinity tags, stable isotopic labeling by amino acids in cell culture, isobaric tags for relative and absolute quantification, multidimensional protein identification technology, activity-based probes, protein/peptide arrays, phage displays and two-hybrid systems, is utilized in multiple areas throughout the drug-development pipeline, including target and lead identification, compound optimization, the clinical trials process and post-market analysis. Metabolomics, although the most recent and least developed of the three 'omics considered in this review, provides a significant contribution to drug development through systems biology approaches. Already implemented to some degree in the drug-discovery industry and used in applications spanning target identification through to toxicological analysis, metabolic network understanding is essential in generating future discoveries.