793 results for objective modality


Relevance: 20.00%

Abstract:

The accurate assessment of dietary exposure is important in investigating associations between diet and disease. Research in nutritional epidemiology, which has produced a large amount of information on associations between diet and chronic disease over the last decade, relies on accurate assessment methods to identify these associations. However, most dietary assessment instruments rely to some extent on self-reporting, which is prone to systematic bias influenced by factors such as age, gender, social desirability and approval. Nutritional biomarkers are not affected by these biases and therefore provide an additional, alternative method of estimating intake. However, their application also has limitations: biomarkers are affected by inter-individual variations in metabolism and other physiological factors, and they are often limited to estimating the intake of specific compounds rather than entire foods. It is therefore important to validate nutritional biomarkers to determine their specific strengths and limitations. In this perspective paper, criteria for the validation of nutritional biomarkers and future developments are discussed.

Relevance: 20.00%

Abstract:

1. Species-based indices are frequently employed as surrogates for wider biodiversity health and as measures of environmental condition. Species selection is crucial in determining an indicator's metric value, and hence the validity of the interpretation of ecosystem condition and function that it provides, yet an objective process for identifying appropriate indicator species is frequently lacking.
2. An effective indicator needs to (i) be representative, reflecting the status of wider biodiversity; (ii) be reactive, acting as an early-warning system for detrimental changes in environmental conditions; and (iii) respond to change in a predictable way. We present an objective, niche-based approach to species selection, founded on a coarse categorisation of species' niche space and key resource requirements, which ensures that the resultant indicator has these key attributes.
3. We use UK farmland birds as a case study to demonstrate this approach, identifying an optimal indicator set containing 12 species. In contrast to the 19 species included in the farmland bird index (FBI), a key UK biodiversity indicator that contributes to one of the UK Government's headline indicators of sustainability, the niche space occupied by these 12 species fully encompasses that occupied by the wider community of 62 species.
4. We demonstrate that the response of these 12 species to land-use change correlates strongly with that of the wider farmland bird community. Furthermore, the temporal dynamics of an index based on their population trends closely match the population dynamics of the wider community. However, in both analyses the magnitude of change in our indicator was significantly greater, allowing it to act as an early-warning system.
5. Ecological indicators are embedded in environmental management, sustainable development and biodiversity conservation policy and practice, where they act as metrics against which progress towards national, regional and global targets can be measured. Adopting this niche-based approach to the objective selection of indicator species will facilitate the development of sensitive and representative indices for a range of taxonomic groups, habitats and spatial scales.
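
The selection procedure itself is not spelled out in the abstract, but its core idea, choosing a small species set whose combined niche categories cover the niche space occupied by the wider community, can be illustrated with a greedy set-cover sketch. The species names and niche categories below are invented placeholders, not data from the study.

```python
# Illustrative greedy set-cover selection of indicator species.
# The niche categories and species-to-niche mapping are hypothetical.

def select_indicators(niche_by_species, target_niche):
    """Greedily pick species until their combined niche categories
    cover the niche space occupied by the wider community."""
    chosen, covered = [], set()
    while covered != target_niche:
        # Pick the species adding the most uncovered niche categories.
        best = max(niche_by_species,
                   key=lambda sp: len(niche_by_species[sp] - covered))
        gain = niche_by_species[best] - covered
        if not gain:  # remaining niche space cannot be covered
            break
        chosen.append(best)
        covered |= gain
    return chosen

niche_by_species = {                # hypothetical coarse niche categories
    "skylark":      {"open ground nesting", "seed diet"},
    "yellowhammer": {"hedgerow nesting", "seed diet"},
    "lapwing":      {"open ground nesting", "invertebrate diet"},
}
community_niche = set().union(*niche_by_species.values())
print(select_indicators(niche_by_species, community_niche))
```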

Relevance: 20.00%

Abstract:

Considerable effort is presently being devoted to producing high-resolution sea surface temperature (SST) analyses with a goal of spatial grid resolutions as low as 1 km. Because grid resolution is not the same as feature resolution, a method is needed to objectively determine the resolution capability and accuracy of SST analysis products. Ocean model SST fields are used in this study as simulated “true” SST data and subsampled based on actual infrared and microwave satellite data coverage. The subsampled data are used to simulate sampling errors due to missing data. Two different SST analyses are considered and run using both the full and the subsampled model SST fields, with and without additional noise. The results are compared as a function of spatial scales of variability using wavenumber auto- and cross-spectral analysis. The spectral variance at high wavenumbers (smallest wavelengths) is shown to be attenuated relative to the true SST because of smoothing that is inherent to both analysis procedures. Comparisons of the two analyses (both having grid sizes of roughly ) show important differences. One analysis tends to reproduce small-scale features more accurately when the high-resolution data coverage is good but produces more spurious small-scale noise when the high-resolution data coverage is poor. Analysis procedures can thus generate small-scale features with and without data, but the small-scale features in an SST analysis may be just noise when high-resolution data are sparse. Users must therefore be skeptical of high-resolution SST products, especially in regions where high-resolution (~5 km) infrared satellite data are limited because of cloud cover.
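
The wavenumber auto- and cross-spectral comparison can be sketched for synthetic one-dimensional SST transects; the red-noise "truth", the boxcar smoothing standing in for an analysis procedure, and the grid spacing are all assumptions for illustration. Segment averaging is used because squared coherence computed from a single periodogram is identically one.

```python
# Hypothetical 1-D illustration of the spectral comparison: a smoothed
# "analysis" loses variance at high wavenumbers relative to the "truth".
import numpy as np

dx = 1.0                                   # assumed grid spacing, km
rng = np.random.default_rng(0)
true_sst = np.cumsum(rng.standard_normal(4096)) * 0.01     # red-noise truth
analysis = np.convolve(true_sst, np.ones(9) / 9, mode="same")  # smoothing

def averaged_spectra(a, b, nseg=8):
    """Segment-averaged auto- and cross-spectra and squared coherence."""
    seg = len(a) // nseg
    k = np.fft.rfftfreq(seg, d=dx)         # wavenumber, cycles per km
    paa = pbb = pab = 0.0
    for i in range(nseg):
        sa = a[i * seg:(i + 1) * seg]
        sb = b[i * seg:(i + 1) * seg]
        fa = np.fft.rfft(sa - sa.mean())
        fb = np.fft.rfft(sb - sb.mean())
        paa = paa + np.abs(fa) ** 2
        pbb = pbb + np.abs(fb) ** 2
        pab = pab + fb * np.conj(fa)
    # Skip the zero wavenumber (demeaning makes it exactly zero).
    coh2 = np.abs(pab[1:]) ** 2 / (paa[1:] * pbb[1:])
    return k[1:], paa[1:] / nseg, pbb[1:] / nseg, coh2

# Attenuated p_anal and low coh2 at high wavenumber flag scales that are
# smoothed out or spurious rather than genuinely resolved.
k, p_true, p_anal, coh2 = averaged_spectra(true_sst, analysis)
```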

Relevance: 20.00%

Abstract:

Older adults often experience memory impairments but can sometimes use selective processing and schematic support to remember important information. The current experiments investigate the degree to which younger and healthy older adults remember medication side effects that were subjectively or objectively important to remember. Participants studied a list of common side effects, rated how negative each effect would be to experience, and were then given a free recall test. In Experiment 1, the severity of the side effects ranged from mild (e.g., itching) to severe (e.g., stroke); in Experiment 2, certain side effects were indicated as critical to remember (i.e., "contact your doctor if you experience this"). There were no age differences in free recall of the side effects, and older adults remembered more severe side effects relative to mild ones. However, older adults were less likely than younger adults to recognize critical side effects on a later recognition test. The findings suggest that older adults can selectively remember medication side effects but have difficulty identifying familiar yet potentially critical ones, which has implications for monitoring medication use in older age.

Relevance: 20.00%

Abstract:

Objective. Therapeutic alliance, modality, and the ability to engage with the process of therapy have been the main focus of research into what makes psychotherapy successful. Individuals with complex trauma histories or schizophrenia are thought to be more difficult to engage and may be less likely to benefit from therapy. This study aimed to track the in-session 'process' of working alliance and emotional processing of trauma memories for individuals with schizophrenia.
Design. The study utilized session recordings from the treatment arm of an open randomized clinical trial investigating trauma-focused cognitive behavioural therapy (TF-CBT) for individuals with schizophrenia (N = 26).
Method. Observer measures of working alliance, emotional processing, and affect arousal were rated at early and late phases of therapy. Correlation analysis was undertaken for the process measures, and a temporal analysis of expressed emotions is also reported.
Results. Working alliance was established and maintained throughout therapy; however, agreement on goals declined in the late phase. Participants appeared able to engage in emotional processing, but not to the level required for successful cognitive restructuring.
Conclusion. This study undertook a novel exploration of process variables not usually examined in CBT and is the first study of process for TF-CBT with individuals with schizophrenia. This complex clinical sample showed no difficulty in engagement; however, they may not be able to fully meet the cognitive-emotional demands of this type of therapy. Clinical and research implications and potential limitations of these methods are considered.

Relevance: 20.00%

Abstract:

Empirical Mode Decomposition (EMD) is a data-driven technique for extracting oscillatory components from data. Although it was introduced over 15 years ago, its mathematical foundations are still missing, which also implies a lack of objective metrics for evaluating the decomposed set. The most common technique for assessing EMD results is visual inspection, which is highly subjective. This article provides objective measures for assessing EMD results based on the original definition of oscillatory components.
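
The "original definition" referred to is the usual intrinsic mode function (IMF) requirement: the numbers of extrema and zero crossings differ by at most one, and the mean of the upper and lower envelopes is near zero. A minimal sketch of objective checks along those lines (not necessarily the specific measures proposed in the article) could look like this:

```python
# Objective sanity checks for a candidate IMF, based on the standard
# IMF definition; illustrative only, and assumes the component has
# enough extrema for spline envelopes.
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def imf_measures(c):
    maxima = argrelextrema(c, np.greater)[0]
    minima = argrelextrema(c, np.less)[0]
    zero_crossings = int(np.sum(np.diff(np.signbit(c).astype(int)) != 0))
    # Extrema and zero-crossing counts must differ by at most one.
    count_ok = abs(len(maxima) + len(minima) - zero_crossings) <= 1
    # Energy of the envelope mean relative to the component's energy;
    # a proper IMF should make this ratio small.
    t = np.arange(len(c))
    upper = CubicSpline(maxima, c[maxima])(t)
    lower = CubicSpline(minima, c[minima])(t)
    mean_env = 0.5 * (upper + lower)
    return count_ok, float(np.sum(mean_env ** 2) / np.sum(c ** 2))

t = np.linspace(0, 1, 1000)
print(imf_measures(np.sin(2 * np.pi * 10 * t)))   # a clean oscillation
```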

Relevance: 20.00%

Abstract:

Extratropical transition (ET) has eluded objective identification since its existence was recognised in the 1970s. Recent advances in numerical models have provided data of higher resolution than previously available. In conjunction with this, an objective characterisation of storm structure has become widely accepted in the literature. Here we present a method that combines these two advances to provide an objective definition of ET. The approach applies K-means clustering to isolate different life-cycle stages of cyclones and then analyses the progression through these stages. The methodology is tested by applying it to five recent years of European Centre for Medium-Range Weather Forecasts operational analyses. The method is able to determine the general characteristics of ET in the Northern Hemisphere: between 2008 and 2012, 54% (±7%; 32 of 59) of Northern Hemisphere tropical storms are estimated to have undergone ET, with great variability across basins and time of year. To fully capture all instances of ET, it is necessary to introduce and characterise multiple pathways through transition; only one of the three transition types needed has previously been well studied. A brief description of the alternative transition types is given, along with illustrative storms, to assist further study.
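
A minimal sketch of the clustering step, using scikit-learn's KMeans on per-track-point structure parameters (three synthetic features here stand in for an accepted storm-structure characterisation, e.g. thermal symmetry and thermal-wind parameters); the stage identities and the transition rule are illustrative assumptions, not the paper's exact definition.

```python
# Cluster cyclone track points into life-cycle stages, then flag ET as
# a visit to a "tropical" stage followed later by an "extratropical"
# stage. Features and stage identities are synthetic placeholders.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
features = rng.standard_normal((500, 3))       # one row per track point

kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(features)
stages = kmeans.labels_

def undergoes_et(stage_seq, tropical=0, extratropical=1):
    """True if the storm occupies the tropical stage and later the
    extratropical stage (one simple progression rule)."""
    seq = list(stage_seq)
    if tropical not in seq:
        return False
    return extratropical in seq[seq.index(tropical) + 1:]

# Example: treat each run of 50 points as one storm's life cycle.
storms = stages.reshape(10, 50)
print(sum(undergoes_et(s) for s in storms), "of", len(storms))
```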

Relevance: 20.00%

Abstract:

Quantifying the effect of seawater density changes on sea level variability is of crucial importance for climate change studies, as cumulative sea level rise can be regarded both as an important climate change indicator and as a possible danger for human activities in coastal areas. In this work, as part of the Ocean Reanalysis Intercomparison Project, global and regional steric sea level changes are estimated and compared across an ensemble of 16 ocean reanalyses and 4 objective analyses. These estimates are first compared with a satellite-derived (altimetry minus gravimetry) dataset for a short period (2003-2010). The ensemble mean exhibits significantly high correlation at both global and regional scales, and the ensemble of ocean reanalyses outperforms that of the objective analyses, particularly in the Southern Ocean. The reanalysis ensemble mean thus represents a valuable tool for further analyses, although large uncertainties remain in the inter-annual trends. Over the extended intercomparison period spanning the altimetry era (1993-2010), the ensembles of reanalyses and objective analyses are in good agreement, detecting global steric sea level trends of 1.0 and 1.1 ± 0.05 mm/year, respectively. However, the spread among the products in the halosteric component trend exceeds the mean trend itself, calling the reliability of that estimate into question; this is related to the scarcity of salinity observations before the Argo era. Furthermore, the contribution of deep ocean layers to steric sea level variability is non-negligible (22% and 12% for the layers below 700 m and 1500 m depth, respectively), although the small deep-ocean trends are not significant relative to the spread among products.
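
Steric sea level is the depth integral of the density anomaly, eta_steric = -(1/rho0) * integral of rho' dz. A minimal sketch, assuming a simplified linear equation of state (real products use the full TEOS-10 equation of state), shows how warming a water column maps to a steric rise:

```python
# Minimal steric sea level estimate from a single T/S profile, using a
# simplified linear equation of state; coefficients are illustrative.
import numpy as np

rho0, alpha, beta = 1025.0, 2.0e-4, 7.6e-4   # kg/m^3, 1/K, kg/g

def steric_height(temp, salt, temp_ref, salt_ref, dz):
    """eta_steric = -(1/rho0) * sum of density anomaly over depth."""
    rho_anom = rho0 * (-alpha * (temp - temp_ref) + beta * (salt - salt_ref))
    return -np.sum(rho_anom * dz) / rho0

dz = np.full(50, 20.0)                  # 50 layers x 20 m = 1000 m column
temp = np.linspace(18.0, 4.0, 50)       # synthetic profile, deg C
salt = np.full(50, 35.0)
# 0.5 K uniform warming of a 1000 m column gives ~0.1 m of steric rise.
print(steric_height(temp + 0.5, salt, temp, salt, dz))
```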

Relevance: 20.00%

Abstract:

Objective. To compare the nutritional value of meals provided by companies participating in the Workers' Meal Program in the city of São Paulo, Brazil, with the nutritional recommendations and guidelines established by the Ministry of Health for the Brazilian population.
Methods. The 72 companies studied were grouped according to economic sector (industrial, services, or commerce), size (micro, small, medium, or large), meal preparation modality (prepared on-site by the company itself, on-site by a hired caterer, or off-site by a hired caterer), and supervision by a dietitian (yes or no). The per capita amount of food was determined based on the lunch, dinner, and supper menus for three days. The nutritional value of the meals was defined by the amount of calories, carbohydrates, protein, total fat, polyunsaturated fat, saturated fat, trans fat, sugars, cholesterol, and fruits and vegetables.
Results. Most of the menus were deficient in fruits and vegetables (63.9%) and polyunsaturated fat (83.3%), but high in total fat (47.2%) and cholesterol (62.5%). Group 2, composed mostly of medium and large companies supervised by a dietitian, belonging to the industrial and/or service sectors, and using a hired caterer, on average served meals with higher calorie content (P < 0.001), a higher percentage of polyunsaturated fat (P < 0.001), more cholesterol (P = 0.015), and more fruits and vegetables (P < 0.001) than Group 1, which was composed of micro and small companies from the commercial sector that prepare their own meals on-site without dietitian supervision. Relative to the nutrition guidelines set for the Brazilian population, Group 2 meals were better in terms of fruit and vegetable servings (P < 0.001), while Group 1 meals were better in terms of cholesterol content (P = 0.05).
Conclusions. More specific action is required targeting company officers and managers in charge of food and nutrition services, especially in companies without dietitian supervision.

Relevance: 20.00%

Abstract:

The objective of the current study was to analyze the effects of rhinoseptoplasty on the internal nasal dimensions and speech resonance of individuals with unilateral cleft lip and palate, as estimated by acoustic rhinometry and nasometry, respectively. Twenty-one individuals (aged 15-46 years) with previously repaired unilateral cleft lip and palate were analyzed before surgery (PRE) and 6 to 9 months (POST1) and 12 to 18 months (POST2) after surgery. Acoustic rhinometry was used to measure the cross-sectional areas (CSAs) of segments corresponding to the nasal valve (CSA1) and the anterior (CSA2) and posterior (CSA3) portions of the lower turbinate, as well as the volumes of the nasal valve (V1) and turbinate (V2) regions, on the cleft and noncleft sides, before and after nasal decongestion with a topical vasoconstrictor. Nasometry was used to evaluate speech nasalance during the reading of one set of sentences containing nasal sounds and another devoid of nasal sounds. On the cleft side, before nasal decongestion, there was a significant increase (P < 0.05) in mean CSA1 and V1 values at POST1 and POST2 compared with PRE. After decongestion, increased values were also observed for CSA2 and V2 at POST2. No significant changes were observed on the noncleft side. Mean nasalance values at PRE, POST1, and POST2 did not differ from each other for either the oral or the nasal sentences. The measurement of CSAs and volumes by acoustic rhinometry revealed that rhinoseptoplasty provided, in most of the cases analyzed, a significant increase in nasal patency without concomitant changes in speech resonance, as estimated by nasalance assessment.
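
Nasalance, the quantity nasometry reports, is nasal acoustic energy expressed as a percentage of combined nasal and oral energy. A minimal sketch with synthetic two-channel data (commercial nasometers additionally band-pass filter the signals, which is omitted here):

```python
# Nasalance score from two synthetic microphone channels: nasal energy
# as a percentage of nasal-plus-oral energy. Illustrative only.
import numpy as np

def nasalance(nasal, oral):
    nasal_rms = np.sqrt(np.mean(nasal ** 2))
    oral_rms = np.sqrt(np.mean(oral ** 2))
    return 100.0 * nasal_rms / (nasal_rms + oral_rms)

rng = np.random.default_rng(2)
print(nasalance(0.3 * rng.standard_normal(16000),    # weak nasal channel
                1.0 * rng.standard_normal(16000)))   # ~23% for this mix
```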

Relevance: 20.00%

Abstract:

In this paper, we present an algorithm for cluster analysis that integrates aspects of cluster ensembles and multi-objective clustering. The algorithm is based on a Pareto-based multi-objective genetic algorithm with a special crossover operator, which uses clustering validation measures as objective functions. The proposed algorithm can deal with data sets presenting different types of clusters, without requiring expertise in cluster analysis. Its result is a concise set of partitions representing alternative trade-offs among the objective functions. We compare the results obtained with our algorithm, in the context of gene expression data sets, to those achieved with Multi-Objective Clustering with automatic K-determination (MOCK), the algorithm most closely related to ours.
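
The Pareto-based idea, though not the authors' genetic algorithm itself, can be sketched by scoring candidate partitions with two validation measures and keeping the nondominated set; the use of KMeans to generate candidates and the choice of these two objectives are assumptions for illustration.

```python
# Score candidate partitions with two validation measures and keep the
# Pareto-optimal (nondominated) trade-offs. Not the authors' GA.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score

X, _ = make_blobs(n_samples=300, centers=4, random_state=0)

candidates = []
for k in range(2, 9):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    # Objectives (both minimized): negative silhouette and
    # within-cluster sum of squares per point.
    neg_sil = -silhouette_score(X, labels)
    wcss = sum(((X[labels == c] - X[labels == c].mean(0)) ** 2).sum()
               for c in range(k)) / len(X)
    candidates.append((k, neg_sil, wcss))

pareto = [c for c in candidates
          if not any(o[1] <= c[1] and o[2] <= c[2] and
                     (o[1] < c[1] or o[2] < c[2]) for o in candidates)]
print(pareto)   # the concise set of trade-off partitions (by k)
```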

Relevance: 20.00%

Abstract:

Although the oral cavity is easily accessible to inspection, patients with oral cancer most often present at a late stage, leading to high morbidity and mortality. Autofluorescence imaging has emerged as a promising technology to aid clinicians in screening for oral neoplasia and as an aid to resection, but current approaches rely on subjective interpretation. We present a new method to objectively delineate neoplastic oral mucosa using autofluorescence imaging. Autofluorescence images were obtained from 56 patients with oral lesions and 11 normal volunteers. From these images, 276 measurements from 159 unique regions of interest (ROI) sites corresponding to normal and confirmed neoplastic areas were identified. Data from ROIs in the first 46 subjects were used to develop a simple classification algorithm based on the ratio of red-to-green fluorescence; performance of this algorithm was then validated using data from the ROIs in the last 21 subjects. This algorithm was applied to patient images to create visual disease probability maps across the field of view. Histologic sections of resected tissue were used to validate the disease probability maps. The best discrimination between neoplastic and nonneoplastic areas was obtained at 405 nm excitation; normal tissue could be discriminated from dysplasia and invasive cancer with a 95.9% sensitivity and 96.2% specificity in the training set, and with a 100% sensitivity and 91.4% specificity in the validation set. Disease probability maps qualitatively agreed with both clinical impression and histology. Autofluorescence imaging coupled with objective image analysis provided a sensitive and noninvasive tool for the detection of oral neoplasia.
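
A sketch of what such a ratio-based classifier might look like: a per-pixel red-to-green ratio image mapped to a disease probability. The logistic mapping and its constants are placeholders; the study derived its decision rule from the training ROIs rather than from these values.

```python
# Per-pixel red-to-green autofluorescence ratio mapped to a "disease
# probability" via a placeholder logistic function; the images here are
# random stand-ins for fluorescence intensity data.
import numpy as np

def ratio_map(red, green, eps=1e-6):
    return red / (green + eps)

def probability_map(ratio, midpoint=0.7, steepness=10.0):
    """Logistic mapping: ratios well above the (assumed) training
    midpoint approach probability 1 (neoplastic)."""
    return 1.0 / (1.0 + np.exp(-steepness * (ratio - midpoint)))

rng = np.random.default_rng(3)
red = rng.uniform(0.1, 1.0, (64, 64))
green = rng.uniform(0.1, 1.0, (64, 64))
prob = probability_map(ratio_map(red, green))   # disease probability map
```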

Relevance: 20.00%

Abstract:

Background. Along the internal carotid artery (ICA), atherosclerotic plaques are often located in its cavernous sinus (parasellar) segments (pICA). Studies indicate that the incidence of pre-atherosclerotic lesions is linked to the complexity of the pICA; however, the pICA shape has never been objectively characterized. Our study aims to provide objective mathematical characterizations of the pICA shape.
Methods and results. Three-dimensional (3D) computer models, reconstructed from contrast-enhanced computed tomography (CT) data of 30 randomly selected patients (60 pICAs), were analyzed with modern visualization software and new mathematical algorithms. As objective measures of pICA shape complexity, we provide calculations of the curvature energy, torsion energy, and total complexity of 3D skeletons of the pICA lumen. We further measured the posterior knee of the so-called "carotid siphon" with a virtual goniometer and correlated the objective mathematical calculations with the subjective angle measurements.
Conclusions. Firstly, our study provides mathematical characterizations of the pICA shape that can serve as objective reference data for analyzing connections between pICA shape complexity and vascular diseases. Secondly, we provide an objective method for creating such data. Thirdly, we evaluate the usefulness of subjective goniometric measurements of the angle of the posterior knee of the carotid siphon.
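
Curvature and torsion energies of a centerline can be computed from the standard Frenet formulas, kappa = |r' x r''| / |r'|^3 and tau = (r' x r'') . r''' / |r' x r''|^2, integrated along arc length. A finite-difference sketch follows; the discretization is an assumption, as the study's exact algorithms are not given in the abstract.

```python
# Curvature and torsion energies of an ordered 3-D centerline via
# finite differences; illustrative discretization.
import numpy as np

def shape_energies(points):
    """points: (n, 3) array of ordered skeleton coordinates."""
    d1 = np.gradient(points, axis=0)
    d2 = np.gradient(d1, axis=0)
    d3 = np.gradient(d2, axis=0)
    cross = np.cross(d1, d2)
    speed = np.linalg.norm(d1, axis=1)
    kappa = np.linalg.norm(cross, axis=1) / speed ** 3
    tau = np.einsum("ij,ij->i", cross, d3) / np.linalg.norm(cross, axis=1) ** 2
    ds = speed                                   # arc-length element
    curvature_energy = np.sum(kappa ** 2 * ds)   # integral of kappa^2 ds
    torsion_energy = np.sum(tau ** 2 * ds)       # integral of tau^2 ds
    return curvature_energy, torsion_energy

# Example: one turn of a helix, which has constant curvature and torsion.
t = np.linspace(0, 2 * np.pi, 200)
helix = np.column_stack([np.cos(t), np.sin(t), 0.2 * t])
print(shape_energies(helix))
```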

Relevance: 20.00%

Abstract:

In the late seventies, Megiddo proposed a way to use an algorithm for the problem of minimizing a linear function $a_0 + a_1x_1 + \dots + a_nx_n$ subject to certain constraints to solve the problem of minimizing a rational function of the form $(a_0 + a_1x_1 + \dots + a_nx_n)/(b_0 + b_1x_1 + \dots + b_nx_n)$ subject to the same set of constraints, assuming that the denominator is always positive. Using a rather strong assumption, Hashizume et al. extended Megiddo's result to include approximation algorithms. Their assumption essentially asks for the existence of good approximation algorithms for optimization problems with possibly negative coefficients in the (linear) objective function, which is rather unusual for most combinatorial problems. In this paper, we present an alternative extension of Megiddo's result for approximations that avoids this issue and applies to a large class of optimization problems. Specifically, we show that, if there is an $\alpha$-approximation for the problem of minimizing a nonnegative linear function subject to constraints satisfying a certain increasing property, then there is an $\alpha$-approximation (a $1/\alpha$-approximation) for the problem of minimizing (maximizing) a nonnegative rational function subject to the same constraints. Our framework applies to covering problems and network design problems, among others.
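
The parametric idea behind such reductions can be illustrated with a Dinkelbach-style iteration (a close relative of Megiddo's approach, not the paper's approximation framework): repeatedly minimize the linearized objective $a \cdot x - \lambda\, b \cdot x$ and update $\lambda$ with the current ratio. The tiny brute-force feasible set and the cost vectors below are invented for the example.

```python
# Dinkelbach-style minimization of (a.x)/(b.x) via repeated
# minimization of a.x - lam * b.x over the same feasible set.
from itertools import product

a = [3.0, 1.0, 2.0]          # illustrative numerator costs
b = [1.0, 2.0, 1.0]          # denominator positive on all feasible x
feasible = [x for x in product([0, 1], repeat=3) if sum(x) >= 1]

def oracle(lam):
    """Exact minimizer of the parametric linear objective; in the
    reduction this is where the (approximation) algorithm plugs in."""
    return min(feasible,
               key=lambda x: sum((ai - lam * bi) * xi
                                 for ai, bi, xi in zip(a, b, x)))

lam = 0.0
for _ in range(20):
    x = oracle(lam)
    num = sum(ai * xi for ai, xi in zip(a, x))
    den = sum(bi * xi for bi, xi in zip(b, x))
    if abs(num - lam * den) < 1e-12:   # lam has reached the optimal ratio
        break
    lam = num / den
print(x, lam)                          # optimal solution and ratio
```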