842 results for consistency in indexing
Abstract:
Both culture coverage and digital journalism are contemporary phenomena that have undergone several transformations within a short period of time. Whenever the media enter a period of uncertainty such as the present one, there is an attempt to innovate in order to seek sustainability, overcome the crisis or find a new public. This indicates that there are new trends to be understood and explored, i.e., how are media innovating in a digital environment? Not only does the professional debate about the future of journalism justify the need to explore the issue, but so do the academic approaches to cultural journalism. However, none of the studies so far have considered innovation as a motto or driver and tried to explain how the media are covering culture, achieving sustainability and engaging with readers in a digital environment. This research examines how European media which specialize in culture or have an important cultural section are innovating in a digital environment. Specifically, we examine how these innovation strategies are being applied in relation to the approach to culture and dominant cultural areas, editorial models, the use of digital tools for telling stories, overall brand positioning and extensions, engagement with the public and business models. We conducted a mixed-methods study combining case studies of four media projects – integrating qualitative analysis of web features and content – with quantitative web content analysis. The four case studies chosen were two major general-interest journalistic brands which started as physical newspapers – The Guardian (London, UK) and Público (Lisbon, Portugal) – a magazine specialized in international affairs, culture and design – Monocle (London, UK) – and a native digital media project launched by a cultural organization – Notodo, by La Fábrica. Findings suggest, on the one hand, that we are witnessing a paradigm shift in culture coverage in a digital environment, challenging traditional boundaries related to cultural themes and scope, angles, genres, content format and delivery, engagement and business models. Innovation in the four case studies lies especially along the product dimensions (format and content), brand positioning and process (business model and ways to engage with users). On the other hand, there are still perennial values that are crucial to innovation and sustainability, such as commitment to journalism, consistency (to the reader, to brand extensions and to the advertiser), intelligent differentiation and the capability of knowing what innovation means and how it can be applied, since this thesis also confirms that one formula does not suit all. Changing mindsets, overcoming cultural inertia and optimizing the memory of the websites – looking at them as living, organic bodies which continuously interact with readers in many different ways, rather than as a closed collection of articles – are still the main challenges for some media.
Abstract:
Forest regrowth occupies an extensive and increasing area in the Amazon basin, but accurate assessment of the impact of regrowth on carbon and nutrient cycles has been hampered by a paucity of available allometric equations. We develop pooled and species-specific equations for total aboveground biomass for a study site in the eastern Amazon that had been abandoned for 15 years. Field work was conducted using randomized branch sampling, a rapid technique that has seen little use in tropical forests. The high consistency of sample paths in randomized branch sampling, as measured by the standard error of individual paths (14%), suggests the method may provide substantial efficiencies compared with traditional procedures. The best-fitting equations in this study used the traditional form Y = a × DBH^b, where Y is biomass, DBH is diameter at breast height, and a and b are both species-specific parameters. Species-specific equations of the form Y = a × (BA × H), where Y is biomass, BA is tree basal area, H is tree height, and a is a species-specific parameter, fit almost as well. Comparison with previously published equations indicated that errors from -33% to +29% would have occurred using off-site relationships. We also present equations for stemwood, twigs, and foliage as biomass components.
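As a hedged illustration of fitting the allometric forms reported above, the Python sketch below estimates Y = a × DBH^b by nonlinear least squares and by the common log-log linearization; the DBH and biomass values are invented stand-ins, not the study's data.

```python
import numpy as np
from scipy.optimize import curve_fit

# Illustrative DBH (cm) and aboveground biomass (kg) values -- not the study's data.
dbh = np.array([5.0, 8.2, 12.1, 15.5, 20.3, 25.8])
biomass = np.array([4.1, 13.9, 38.2, 72.5, 148.0, 270.1])

def power_law(dbh, a, b):
    # Traditional allometric form Y = a * DBH^b
    return a * dbh ** b

# Nonlinear least squares; p0 supplies plausible starting values.
(a_hat, b_hat), _ = curve_fit(power_law, dbh, biomass, p0=(0.1, 2.5))
print(f"Y = {a_hat:.3f} * DBH^{b_hat:.3f}")

# Alternative: ordinary least squares on the log-log transform,
# log Y = log a + b log DBH, a common way to linearize the model.
b_ols, log_a_ols = np.polyfit(np.log(dbh), np.log(biomass), 1)
print(f"log-log OLS: a = {np.exp(log_a_ols):.3f}, b = {b_ols:.3f}")
```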
Abstract:
This paper discusses models, associations and causation in psychiatry. The different types of association (linear, positive, negative, exponential, partial, U-shaped, hidden and spurious) between variables involved in mental disorders are presented, as well as the use of multiple regression analysis to disentangle the interrelatedness amongst multiple variables. A useful model should have internal consistency, external validity and predictive power; be dynamic in order to accommodate new sound knowledge; and should fit facts rather than the other way around. It is argued that whilst models are theoretical constructs, they also convey a style of reasoning and can change clinical practice. Cause and effect are complex phenomena in that the same cause can yield different effects; conversely, the same effect can arise from a range of different causes. In mental disorders and human behaviour there is always a chain of events initiated by the indirect and remote cause, followed by intermediate causes, and finally the direct and more immediate cause. Causes of mental disorders are grouped as those: (i) which are necessary and sufficient; (ii) which are necessary but not sufficient; and (iii) which are neither necessary nor sufficient, but when present increase the risk for mental disorders.
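The role the abstract gives to multiple regression in disentangling interrelated variables can be made concrete with a small simulation; this sketch, with entirely hypothetical variables, shows a spurious association between two outcomes vanishing once their shared remote cause is adjusted for.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Hypothetical remote cause C driving both X and Y (no direct X -> Y effect).
c = rng.normal(size=n)
x = 0.8 * c + rng.normal(size=n)
y = 0.8 * c + rng.normal(size=n)

# Crude association: X appears clearly related to Y.
print("crude slope:", np.polyfit(x, y, 1)[0])

# Multiple regression of Y on X and C: the X coefficient collapses to ~0,
# exposing the X-Y association as spurious (driven by the common cause C).
design = np.column_stack([np.ones(n), x, c])
coef, *_ = np.linalg.lstsq(design, y, rcond=None)
print("adjusted X slope:", coef[1])
```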
Abstract:
OBJECTIVES: To investigate the feasibility and ease of administration of a brief and simple instrument addressing impairment associated with adult attention deficit hyperactivity disorder (ADHD), and whether ADHD subtypes were correlated with specific profiles of self-reported impairment. METHODS: Thirty-five adults (19 men and 16 women; mean age of 31.74 years) diagnosed with ADHD according to DSM-IV with a semi-structured interview (K-SADS PL) were asked to fill out a Likert scale covering six different functional areas (academic, professional, marital, family, social and daily activities). Clinicians questioned patients about their understanding of the questionnaire and probed their answers in more detail to check their consistency. RESULTS: No patient reported difficulties in understanding the questionnaire. Further questioning of patients' answers confirmed their choices in the six areas. Academic burden had the highest average score in the whole sample, followed by professional burden. The social area had the lowest average score in this sample.
Abstract:
Objective: This study describes the development of two updating measures of working memory (WM): the Letter Updating Test (LUT) and the Word Updating Test (WUT). Methods: In stage 1, items were created and the instruments were assessed by experts and laymen. In stage 2, the tests were given to 15 patients with schizophrenia and 15 matched controls; all were able to understand and respond to the instruments. In stage 3, 141 patients with schizophrenia and 119 healthy controls aged 18 to 60 took part; they were assessed on WM, processing speed (PS) and functional outcome. Results: The results showed adequate rates of internal consistency for both measures developed, for both the total sample and each group separately, as well as evidence of convergent validity, discriminant validity and sensitivity to differentiate performance between the groups. Principal component analysis yielded two components, one for the updating tests and the other for the PS measures, indicating factorial validity. Positive and significant, yet low, correlations were found with functionality measures. Conclusion: These results provide adequate psychometric parameters for the measures developed, applicable to cognitive research settings in schizophrenia.
Abstract:
Background: The Neonatal Behavioral Assessment Scale (NBAS; Brazelton & Nugent, 1995) is an instrument designed to assess neonatal neurobehavior. Data analysis is usually performed by organizing items into groups; the most widely used data reduction for the NBAS was developed by Lester, Als, and Brazelton (1982). Objective: To examine the psychometric properties of the NBAS items in a sample of 213 Portuguese infants. Method: The NBAS was administered in the first week of life (3±2 days) and in the seventh week of life (52±5 days). Results: Principal component analysis yielded a solution of four components explaining 55.13% of the total variance. Construct validity was supported by the better neurobehavioral performance of 7-week-old infants compared with 1-week-old infants. Conclusion: Changes to the NBAS structure are suggested for the Portuguese sample, compared with the Lester factors, in order to achieve better internal consistency of the scale.
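A rough sketch of the principal component analysis step described above, using scikit-learn; the item matrix shape and scores are assumptions, and the reported four-component solution explaining 55.13% of variance comes from the study itself, not from this code.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

# Hypothetical matrix of NBAS item scores: 213 infants x 20 items (illustrative shape).
rng = np.random.default_rng(42)
scores = rng.normal(size=(213, 20))

# Standardize items, then extract four components, as in the reported solution.
pca = PCA(n_components=4)
pca.fit(StandardScaler().fit_transform(scores))

# Proportion of total variance explained by the four-component solution.
print("variance explained:", pca.explained_variance_ratio_.sum())
```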
Abstract:
The Experiences in Close Relationships (ECR) Inventory evaluates attachment in close relationships during adulthood based on two dimensions that may be present in such relationships: avoidance of proximity and anxiety related to abandonment. It is a 36-item self-report instrument rated on a 7-point Likert scale. The Portuguese version was administered to a sample of 551 university students (60% female), the majority aged between 19 and 24 years (88%) and in a dating relationship (86%). A principal component analysis with oblimin rotation was performed. The total scale has good internal consistency (α=.86), as do the two subscales: anxiety (α=.86) and avoidance (α=.88). The two dimensions evaluated are significantly correlated with socio-demographic variables, relational characteristics (jealousy, relationship distress, and commitment), wishes (enmeshment versus differentiation) and fears (abandonment versus control) related to attitudes in significant relationships, which attests to the construct validity of the instrument. The results obtained are coherent with the original version and other adaptations of the ECR. Practitioners and researchers in clinical psychology and related areas now have at their disposal the Portuguese version of the ECR, which has proven highly useful in the study of close relationships, and specifically of attachment in adulthood.
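For reference, the internal-consistency statistic reported here, Cronbach's alpha, can be computed as in the following sketch; the response matrix is a random stand-in, not ECR data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)    # variance of the scale total
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative data: 551 respondents x 18 items on a 7-point scale
# (e.g., one ECR subscale). Random stand-in data will yield a low alpha;
# real, internally consistent scales yield values like the .86-.88 reported.
rng = np.random.default_rng(1)
responses = rng.integers(1, 8, size=(551, 18)).astype(float)
print(f"alpha = {cronbach_alpha(responses):.2f}")
```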
Abstract:
PURPOSE: To evaluate the left ventricular mass (LVM) index in hypertensive and normotensive obese individuals. METHODS: Using M-mode echocardiography, 544 essential hypertensive and 106 normotensive patients were evaluated, and LVM was indexed to body surface area (LVM/BSA) and to height² (LVM/h²). The two indexes were then compared in both populations, in subgroups stratified according to body mass index (BMI): <27, 27–30, and ≥30 kg/m². RESULTS: The BSA index does not allow identification of significant differences between BMI subgroups. Indexing by height² provides significantly increased values for the high-BMI subgroups in both the normotensive and hypertensive populations. CONCLUSION: Left ventricular hypertrophy (LVH) has been underestimated in the obese with the use of LVM/BSA, because this index treats obesity as a physiological variable. Indexing by height² allows differences between BMI subgroups to become apparent and seems more appropriate for detecting LVH in obese populations.
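A small worked example of the two indexing schemes being compared; the patient values are hypothetical and the Mosteller BSA formula is an assumption, since the paper does not state which BSA formula was used.

```python
import math

def lvm_indexes(lvm_g: float, height_m: float, weight_kg: float):
    """Return (LVM/BSA, LVM/height^2) for a given left ventricular mass."""
    # Mosteller formula for body surface area (m^2) -- one common choice.
    bsa = math.sqrt(height_m * 100 * weight_kg / 3600)
    return lvm_g / bsa, lvm_g / height_m ** 2

# Same LVM in a lean and an obese patient of equal height (hypothetical values):
# indexing by BSA shrinks the obese patient's index, masking hypertrophy,
# while indexing by height^2 leaves it unchanged.
print(lvm_indexes(220, 1.70, 70))   # lean
print(lvm_indexes(220, 1.70, 110))  # obese: lower LVM/BSA, same LVM/h^2
```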
Abstract:
Background: Studies have shown the impact of atrial fibrillation (AF) on patients' quality of life. Specific questionnaires enable the evaluation of relevant events. We previously developed a questionnaire to assess the quality of life of patients with AF (AFQLQ version 1), which was revised in this study with the addition of new domains. Objective: To demonstrate the reproducibility of the AFQLQ version 2 (AFQLQ v.2), which includes the domains of fatigue, illness perception and well-being. Methods: We applied 160 questionnaires (AFQLQ v.2 and SF-36) to 40 patients, at baseline and 15 days later, to measure inter- and intraobserver reproducibility. The stability of quality-of-life scores was determined by test-retest, applying the Bartko intraclass correlation coefficient (ICC). Internal consistency was assessed by Cronbach's alpha. Results: The total test-retest score (n = 40) had an ICC of 0.98 for the AFQLQ v.2 and of 0.94 for the SF-36. In assessing the intra- and interobserver reproducibility of the AFQLQ v.2, the ICC was 0.98 and 0.97, respectively. Internal consistency had a Cronbach's alpha coefficient of 0.82, compatible with good agreement for the AFQLQ v.2. Conclusion: The AFQLQ v.2 performed better than its previous version. Similarly, the domains added contributed to making it more comprehensive and robust for assessing the quality of life of patients with AF.
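A hedged sketch of a test-retest intraclass correlation of the kind reported above, implementing the standard two-way random-effects ICC(2,1); whether this is exactly Bartko's variant as applied in the study is an assumption.

```python
import numpy as np

def icc_2_1(ratings: np.ndarray) -> float:
    """Two-way random-effects ICC(2,1) for an (n_subjects, k_occasions) matrix."""
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)
    col_means = ratings.mean(axis=0)

    # Mean squares from a two-way ANOVA without replication.
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_total = ((ratings - grand) ** 2).sum()
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))

    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )

# Illustrative test-retest scores for 40 patients (columns: baseline, day 15).
rng = np.random.default_rng(7)
true_score = rng.normal(50, 10, size=(40, 1))
scores = true_score + rng.normal(0, 2, size=(40, 2))  # small occasion noise
print(f"ICC(2,1) = {icc_2_1(scores):.2f}")
```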
Abstract:
The appeal to ideas as causal variables and/or constitutive features of political processes increasingly characterises political analysis. Yet, perhaps because of the pace of this ideational intrusion, ideas have too often simply been grafted onto pre-existing explanatory theories at precisely the point at which those theories seem to get into difficulties, with little or no consideration either of the status of such ideational variables or of the character or consistency of the resulting theoretical hybrid. This is particularly problematic, for ideas are far from innocent variables – they can rarely, if ever, be incorporated seamlessly within existing explanatory and/or constitutive theories without ontological and epistemological consequence. We contend that this tendency, along with the limitations of the prevailing Humean conception of causality and the associated epistemological polemic between causal and constitutive logics, continues to plague almost all of the literature that strives to accord an explanatory role to ideas. In trying to move beyond the current vogue for epistemological polemic, we argue that the incommensurability thesis between causal and constitutive logics is only credible in the context of a narrow, Humean, conception of causation. If we reject this in favour of a more inclusive (and ontologically realist) understanding, then it is perfectly possible to chart the causal significance of constitutive processes and to reconstrue the explanatory role of ideas as causally constitutive.
Abstract:
Introduction: Non-invasive brain imaging techniques often contrast experimental conditions across a cohort of participants, obfuscating distinctions in individual performance and brain mechanisms that are better characterised by inter-trial variability. To overcome such limitations, we developed topographic analysis methods for single-trial EEG data [1]; until now, single-trial analysis has typically been based on time-frequency analysis of single-electrode data or single independent components. The method's efficacy is demonstrated for event-related responses to environmental sounds, hitherto studied at the average event-related potential (ERP) level. Methods: Nine healthy subjects participated in the experiment. Meaningful auditory sounds of common objects were used for a target detection task [2]. In each block, subjects were asked to discriminate target sounds, which were living or man-made auditory objects. Continuous 64-channel EEG was acquired during the task. Two datasets were considered for each subject, comprising single trials of the two conditions, living and man-made. The analysis comprised two steps. In the first step, a mixture of Gaussians analysis [3] provided representative topographies for each subject. In the second step, conditional probabilities for each Gaussian provided statistical inference on the structure of these topographies across trials, time, and experimental conditions. A similar analysis was conducted at the group level. Results: The results show that the occurrence of each map is structured in time and consistent across trials at both the single-subject and group levels. Conducting separate analyses of ERPs at the single-subject and group levels, we could quantify the consistency of the identified topographies and their time course of activation within and across participants, as well as across experimental conditions. A general agreement was found with previous analyses at the average ERP level. Conclusions: This novel approach to single-trial analysis promises to have an impact on several domains. In clinical research, it gives the possibility of statistically evaluating single-subject data, an essential tool for analysing patients with specific deficits and impairments and their deviation from normative standards. In cognitive neuroscience, it provides a novel tool for understanding the interdependencies of behaviour and brain activity at both the single-subject and group levels. In basic neurophysiology, it provides a new representation of ERPs and promises to cast light on the mechanisms of their generation and on inter-individual variability.
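A rough sketch of the mixture-of-Gaussians step on single-trial topographies, using scikit-learn rather than the authors' own implementation [3]; the data shapes, preprocessing and number of components are assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical single-trial data: 100 trials x 300 time frames, 64 electrodes.
# Each time frame is one scalp topography (a 64-dimensional vector).
rng = np.random.default_rng(0)
topographies = rng.normal(size=(100 * 300, 64))

# Fit a mixture of Gaussians; each component mean is a template topography.
gmm = GaussianMixture(n_components=5, covariance_type="diag", random_state=0)
gmm.fit(topographies)

# Posterior probabilities per frame: the conditional probabilities used for
# statistical inference on map occurrence across trials, time and conditions.
posteriors = gmm.predict_proba(topographies)   # shape: (n_frames, 5)
template_maps = gmm.means_                     # shape: (5, 64)
```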
Abstract:
Recent attempts to incorporate optimal fiscal policy into New Keynesian models subject to nominal inertia have tended to assume that policy makers are benevolent and have access to a commitment technology. A separate literature, on the New Political Economy, has focused on real economies where there is strategic use of policy instruments in a world of political conflict. In this paper we combine these literatures and assume that policy is set in a New Keynesian economy by one of two policy makers facing electoral uncertainty (in terms of infrequent elections and an endogenous voting mechanism). The policy makers generally share the social welfare function, but differ in their preferences over fiscal expenditure (in its size and/or composition). Given this environment, policy is realistically constrained to be time-consistent. In a sticky-price economy, such heterogeneity gives rise to the possibility of one policy maker using (nominal) debt strategically to tie the hands of the other party and influence the outcome of any future elections. This can give rise to a deficit bias, implying a sub-optimally high level of steady-state debt, and can also imply a sub-optimal response to shocks. The steady-state distortions and inflation bias this generates, combined with the volatility induced by the electoral cycle in a sticky-price environment, can significantly reduce social welfare.
Abstract:
In this paper we study decision making in situations where the individual's preferences are not assumed to be complete. First, we identify conditions that are necessary and sufficient for choice behavior in general domains to be consistent with the maximization of a possibly incomplete preference relation. In this model of maximally dominant choice, the agent defers or avoids choosing at those, and only those, menus where a most preferred option does not exist. This allows for simple explanations of conflict-induced deferral and choice overload. It also suggests a criterion for distinguishing between indifference and incomparability based on observable data. A simple extension of this model also incorporates decision costs and provides a theoretical framework compatible with the experimental design that we propose for eliciting possibly incomplete preferences in the lab. The design builds on the introduction of monetary costs that induce choice of a most preferred feasible option if one exists, and deferral otherwise. Based on this design, we found evidence suggesting that a quarter of the subjects in our study had incomplete preferences, and that these subjects made significantly more consistent choices than a group of subjects who were forced to choose. The latter effect, however, is mitigated once data on indifferences are accounted for.
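The central choice rule of the model can be sketched directly: given a possibly incomplete strict preference relation, the agent chooses a most preferred option when one exists and defers otherwise. The relation and menus below are illustrative toys, not the paper's formalism.

```python
# Maximally dominant choice with a possibly incomplete preference relation.
# `prefers` holds strict preferences as (better, worse) pairs; pairs absent in
# both directions are incomparable (indifference is omitted in this toy).
prefers = {("a", "b"), ("a", "c")}   # "a" beats "b" and "c"; "d" is incomparable

def choose(menu: set[str]) -> str | None:
    """Return an option preferred to all others in `menu`, or None (deferral)."""
    for x in menu:
        if all((x, y) in prefers for y in menu if y != x):
            return x
    return None  # no most preferred option exists -> defer/avoid choosing

print(choose({"a", "b", "c"}))  # 'a': dominates the whole menu
print(choose({"b", "c", "d"}))  # None: b, c, d are mutually incomparable
```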
Abstract:
Time-inconsistency is an essential feature of many policy problems (Kydland and Prescott, 1977). This paper presents and compares three methods for computing Markov-perfect optimal policies in stochastic nonlinear business cycle models. The methods considered include value function iteration, generalized Euler equations, and parameterized shadow prices. In the context of a business cycle model in which a fiscal authority chooses government spending and income taxation optimally, while lacking the ability to commit, we show that the solutions obtained using value function iteration and generalized Euler equations are somewhat more accurate than those obtained using parameterized shadow prices. Among these three methods, we show that value function iteration can be applied easily, even to environments that include a risk-sensitive fiscal authority and/or inequality constraints on government spending. We show that the risk-sensitive fiscal authority lowers government spending and income taxation, reducing the disincentive households face to accumulate wealth.
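Of the three methods compared, value function iteration is the most broadly applicable; the sketch below applies it to a deliberately simple deterministic growth model, far simpler than the paper's optimal-policy problem, purely to show the mechanics of iterating the Bellman operator to convergence.

```python
import numpy as np

# Value function iteration for a toy deterministic growth model:
# V(k) = max_{k'} [ log(k^alpha - k') + beta * V(k') ].
alpha, beta = 0.3, 0.95
grid = np.linspace(0.05, 0.5, 200)            # capital grid
V = np.zeros(len(grid))

# Consumption implied by each (k, k') pair; -inf rules out infeasible choices.
c = grid[:, None] ** alpha - grid[None, :]
utility = np.where(c > 0, np.log(np.maximum(c, 1e-12)), -np.inf)

for _ in range(2000):
    V_new = (utility + beta * V[None, :]).max(axis=1)   # Bellman operator
    done = np.max(np.abs(V_new - V)) < 1e-8             # sup-norm convergence
    V = V_new
    if done:
        break

policy = grid[(utility + beta * V[None, :]).argmax(axis=1)]  # k'(k)
```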
Abstract:
OBJECTIVES: Advances in biopsychosocial science have underlined the importance of taking the social history and a life-course perspective into consideration in primary care. For both clinical and research purposes, this study aims to develop and validate a standardised instrument measuring both material and social deprivation at an individual level. METHODS: We identified relevant potential questions regarding deprivation using a systematic review, structured interviews, focus group interviews and a think-aloud approach. Item response theory analysis was then used to reduce the length of the 38-item questionnaire and derive the Deprivation in Primary Care Questionnaire (DiPCare-Q) index, using data obtained from a random sample of 200 patients during their planned visits to an ambulatory general internal medicine clinic. Patients completed the questionnaire a second time over the phone 3 days later to enable us to assess reliability. The content validity of the DiPCare-Q was then assessed by 17 general practitioners. The psychometric properties and validity of the final instrument were investigated in a second set of patients: the DiPCare-Q was administered to a random sample of 1898 patients attending one of 47 different private primary care practices in western Switzerland, along with questions on subjective social status, education, source of income, welfare status and subjective poverty. RESULTS: Deprivation was defined along three distinct dimensions: material (eight items), social (five items) and health deprivation (three items). Item consistency was high in both the derivation set (Kuder-Richardson Formula 20 (KR-20) = 0.827) and the validation set (KR-20 = 0.778). The DiPCare-Q index was reliable (intraclass correlation coefficient = 0.847) and correlated with subjective social status (rs = -0.539). CONCLUSION: The DiPCare-Q is a rapid, reliable and validated instrument that may prove useful for measuring both material and social deprivation in primary care.
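The Kuder-Richardson Formula 20 used for item consistency applies to dichotomous items; a minimal sketch follows, with a hypothetical yes/no response matrix standing in for the questionnaire data.

```python
import numpy as np

def kr20(items: np.ndarray) -> float:
    """Kuder-Richardson Formula 20 for an (n_respondents, n_items) 0/1 matrix."""
    k = items.shape[1]
    p = items.mean(axis=0)                     # proportion answering 1 per item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1 - (p * (1 - p)).sum() / total_var)

# Hypothetical yes/no deprivation items: 200 respondents x 8 material items.
# Random stand-in data will yield a KR-20 near zero; consistent real scales
# yield values like the 0.827 / 0.778 reported above.
rng = np.random.default_rng(3)
answers = (rng.random((200, 8)) < 0.35).astype(float)
print(f"KR-20 = {kr20(answers):.3f}")
```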