882 results for Results assessment
Abstract:
In recent years, increasing attention has been paid to the composition of feeding fats. In the aftermath of the BSE crisis, all animal by-products used in animal nutrition have been subjected to close scrutiny. Regulation requires that the material belong to the category of animal by-products fit for human consumption, which implies the use of reliable techniques to ensure product safety. The feasibility of using rapid, non-destructive methods to control the composition of feedstuffs with respect to animal fats was studied. Fourier transform Raman spectroscopy was chosen because it provides detailed structural information. Data were treated with chemometric methods such as PCA and PLS-DA, which clearly separated the different classes of animal fats. The same methodology was applied to fats from various types of feedstock and production technology processes. The PLS-DA model for discriminating animal fats from the other categories achieved a sensitivity of 0.958 and a specificity of 0.914. These results encourage the use of FT-Raman spectroscopy to discriminate animal fats.
Abstract:
Educational institutions are considered a keystone of a meritocratic society. They supposedly serve two functions: an educational function that promotes learning for all, and a selection function that sorts individuals into different programs, and ultimately social positions, based on individual merit. We study how the function of selection relates to support for assessment practices known to harm vs. benefit lower-status students, through the perceived justice principles underlying these practices. We study two assessment practices: normative assessment, focused on ranking and social comparison and known to hinder the success of lower-status students, and formative assessment, focused on learning and improvement and known to benefit lower-status students. Normative assessment is usually perceived as relying on an equity principle, with rewards allocated based on merit, and should thus appear positively associated with the function of selection. Formative assessment is usually perceived as relying on corrective justice, which aims to ensure equality of outcomes by considering students' needs, making it less suitable for the function of selection. A questionnaire measuring these constructs was administered to university students. Results showed that believing that education is intended to select the best students positively predicts support for normative assessment, through an increased perception of its reliance on equity, and negatively predicts support for formative assessment, through a reduced perception of its ability to establish corrective justice. This study suggests that the belief in the function of selection as inherent to educational institutions can contribute to the reproduction of social inequalities, both by preventing a change from assessment practices known to disadvantage lower-status students (normative assessment) to more favorable practices (formative assessment), and by promoting matching beliefs in justice principles.
Abstract:
This thesis tests a tool for evaluating the business risks of radical technologies. The theoretical background draws on technology and innovation theories, complemented by the resource-based view of the firm and evolutionary theory. In the theoretical part, a framework is constructed with which business risks can be assessed and a risk profile formed. The business-risk variables are market, technology, and organizational risks. Thematic and structured interviews served as the primary data source. The first interview addressed the usability of the evaluation tool and risk management in general. The remaining interviews concerned the assessment of the business risks of technology A and technology B. The results show that both technologies involve the uncertainty factors characteristic of radical technologies. The usefulness of the risk profile lies in the simultaneous identification of business risks, thereby supporting decision-making. In the evaluation, it is important to pay attention to the perspective from which the risks are examined, in order to improve the validity of the risk profile.
Abstract:
BACKGROUND: In a high proportion of patients with a favorable outcome after aneurysmal subarachnoid hemorrhage (aSAH), neuropsychological deficits, depression, anxiety, and fatigue prevent a return to their regular premorbid life and professional careers. These problems often remain unrecognized, as no recommendations concerning a standardized comprehensive assessment have yet found their way into clinical routine. METHODS: To establish a nationwide standard for comprehensive assessment after aSAH, representatives of all neuropsychological and neurosurgical departments of the eight Swiss centers treating acute aSAH agreed on a common protocol. In addition, a battery of questionnaires and neuropsychological tests was selected that is optimally suited to the deficits found most prevalent in aSAH patients, available in different languages, and standardized. RESULTS: We propose a baseline inpatient neuropsychological screening using the Montreal Cognitive Assessment (MoCA) between days 14 and 28 after aSAH. In an outpatient setting at 3 and 12 months after the bleeding, we recommend a neuropsychological examination testing all relevant domains, including attention, speed of information processing, executive functions, verbal and visual learning/memory, language, visuo-perceptual abilities, and premorbid intelligence. In addition, a detailed assessment capturing anxiety, depression, fatigue, symptoms of frontal lobe affection, and quality of life should be performed. CONCLUSIONS: This standardized neuropsychological assessment will lead to a more comprehensive assessment of the patient, facilitate the detection and subsequent treatment of previously unrecognized but relevant impairments, and help determine the incidence, characteristics, modifiable risk factors, and clinical course of these impairments after aSAH.
Abstract:
BACKGROUND: Port-wine stains (PWS) are capillary malformations present in 0.3% of newborn children. The treatment of choice is pulsed dye laser (PDL), which requires several sessions. The efficacy of this treatment is currently evaluated on the basis of clinical inspection and digital photographs taken throughout the treatment. Laser Doppler imaging (LDI) is a noninvasive method of imaging tissue perfusion by the microcirculatory system (capillaries). The aim of this paper is to demonstrate that LDI allows a quantitative, numerical evaluation of the efficacy of PDL treatment of PWS. METHOD: The PDL sessions were organized according to the usual scheme, every other month, from September 1, 2012, to September 30, 2013. LDI was performed at the start and at the conclusion of the PDL treatment, and simultaneously on healthy skin to obtain reference values. The LDI results were analyzed with the Wilcoxon signed-rank test before and after each session, and in the intervals between the three PDL treatment sessions. RESULTS: Our prospective study is based on 20 children. On average, vascularization of the PWS was reduced by 56% after three laser sessions. Initial vascularization of the PWS was 62% higher than that of healthy skin at the start of treatment, and 6% higher after three sessions. During the 2 months between two sessions, vascularization of the capillary network increased by 27%. CONCLUSION: This study shows that LDI can demonstrate and measure the efficacy of PDL treatment of PWS in children. The figures obtained by LDI corroborate the clinical assessments and may allow us to refine, and perhaps even modify, our present use of PDL and thus improve treatment efficacy.
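The before/after comparison described in the METHOD section can be illustrated with SciPy's paired Wilcoxon signed-rank test; the perfusion values below are invented for illustration, not the study's measurements:

```python
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical paired LDI perfusion values (arbitrary units) for the same
# patients before and after PDL sessions -- illustrative data only.
before = np.array([310, 295, 340, 280, 305, 330, 290, 315, 300, 325], float)
after = before * np.array([0.45, 0.50, 0.40, 0.55, 0.48, 0.42, 0.52, 0.47, 0.44, 0.46])

# Paired, non-parametric test on the per-patient differences
stat, p = wilcoxon(before, after)
reduction = 100 * (1 - after.mean() / before.mean())
print(f"mean reduction = {reduction:.0f}%, p = {p:.4f}")
```

With all ten differences in the same direction, the exact two-sided p-value is well below 0.01, matching the kind of significance such a design can detect.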
Abstract:
BACKGROUND: For free-breathing cardiovascular magnetic resonance (CMR), the recently emerged self-navigation technique is expected to deliver high-quality data with a high success rate. The purpose of this study was to test the hypothesis that self-navigated 3D CMR enables reliable assessment of cardiovascular anatomy in patients with congenital heart disease (CHD) and to define factors that affect image quality. METHODS: CHD patients ≥2 years old referred for CMR for initial assessment or for a follow-up study were included and underwent free-breathing self-navigated 3D CMR at 1.5T. Performance criteria were: correct description of cardiac segmental anatomy, overall image quality, coronary artery visibility, and reproducibility of great-vessel diameter measurements. Factors associated with insufficient image quality were identified using multivariate logistic regression. RESULTS: Self-navigated CMR was performed in 105 patients (55% male, 23 ± 12 y). Correct segmental description was achieved in 93% and 96% of cases for observers 1 and 2, respectively. Diagnostic quality was obtained in 90% of examinations, increasing to 94% in contrast-enhanced examinations. The left anterior descending, circumflex, and right coronary arteries were visualized in 93%, 87%, and 98% of cases, respectively. Younger age, higher heart rate, lower ejection fraction, and lack of contrast medium were independently associated with reduced image quality. However, a similar rate of diagnostic image quality was obtained in children and adults. CONCLUSION: In patients with CHD, self-navigated free-breathing CMR provides high-resolution 3D visualization of the heart and great vessels with excellent robustness.
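A multivariate logistic regression of the kind used to identify factors associated with insufficient image quality can be sketched on simulated data (the predictor names mirror the abstract, but the effect sizes and data are assumptions, not the study's estimates):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000

# Hypothetical predictors: age (y), heart rate (bpm), ejection fraction (%),
# and contrast use (0/1) -- simulated, not patient data.
age = rng.uniform(2, 60, n)
heart_rate = rng.uniform(50, 130, n)
ef = rng.uniform(30, 70, n)
contrast = rng.integers(0, 2, n)

# Assumed model: insufficient quality becomes more likely with higher heart
# rate, and less likely with older age, higher EF, and contrast use.
logit = -1.0 + 0.04 * heart_rate - 0.04 * ef - 0.03 * age - 1.0 * contrast
insufficient = rng.random(n) < 1 / (1 + np.exp(-logit))

X = np.column_stack([age, heart_rate, ef, contrast])
model = LogisticRegression(max_iter=1000).fit(X, insufficient)
print(dict(zip(["age", "heart_rate", "EF", "contrast"], model.coef_.ravel().round(3))))
```

The fitted coefficient signs recover the simulated associations, which is the pattern the study reports for its real predictors.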
Abstract:
BACKGROUND: Our retrospective, international study aimed to evaluate the activity and safety of eribulin mesylate (EM) in pretreated metastatic breast cancer (MBC) in a routine clinical setting. METHODS: Patients treated with EM for locally advanced breast cancer or MBC between March 2011 and January 2014 were included. Clinical and biological toxicity was assessed at each visit. Tumour response was assessed every 3 cycles of treatment. A database was created to collect clinical, pathological and treatment data. RESULTS: Two hundred and fifty-eight patients were included. Median age was 59 years. Tumours were hormone receptor (HR)-positive (73.3%), HER2-positive (10.2%), and triple negative (TN, 22.5%). Visceral metastases were present in 86.4% of patients, mainly in the liver (67.4%). The median number of previous chemotherapy lines for metastatic disease was 4 [range 1-9]. Previous treatments included anthracyclines and/or taxanes (100%) and capecitabine (90.7%). The median number of EM cycles was 5 [range 1-19]. The relative dose intensity was 0.917. At the time of analysis (median follow-up, 13.9 months), 42.3% of patients were still alive. The objective response rate was 25.2% (95% CI: 20-31) with a 36.1% clinical benefit rate (CBR). Median time to progression (TTP) and overall survival were 3.97 (95% CI: 3.25-4.3) and 11.2 (95% CI: 9.3-12.1) months, respectively. One- and 2-year survival rates were 45.5% and 8.5%, respectively. In multivariate analysis, HER2 positivity (hazard ratio = 0.29), the presence of lung metastases (hazard ratio = 2.49) and primary taxane resistance (hazard ratio = 2.36) were the only three independent predictive factors for CBR, while HR positivity (hazard ratio = 0.67), the presence of lung metastases (hazard ratio = 1.52) and primary taxane resistance (hazard ratio = 1.50) were the only three independent prognostic factors for TTP. Treatment was globally well tolerated.
The most common grade 3-4 toxicities were neutropenia (20.9%), peripheral neuropathy (3.9%), anaemia (1.6%), liver dysfunction (0.8%) and thrombocytopenia (0.4%). Thirteen patients (5%) developed febrile neutropenia. CONCLUSION: EM is an effective option in heavily pretreated MBC, with a favourable efficacy/safety ratio in a clinical practice setting. Our results support the use of this molecule and argue for the evaluation of an EM-trastuzumab combination in this setting. Tumour biology, primary taxane sensitivity and metastatic sites could represent useful predictive and prognostic factors.
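As a rough consistency check, the reported 95% CI for the objective response rate can be approximately reproduced with a normal-approximation (Wald) interval, assuming 65 responders out of 258 patients (a count back-calculated from the quoted 25.2%; the study may have used a different interval method):

```python
import math

def wald_ci(successes: int, n: int, z: float = 1.96):
    """Normal-approximation 95% CI for a binomial proportion."""
    p = successes / n
    se = math.sqrt(p * (1 - p) / n)
    return p, p - z * se, p + z * se

# 65/258 ~ 25.2% responders (assumed count, back-calculated from the abstract)
p, lo, hi = wald_ci(65, 258)
print(f"ORR = {100 * p:.1f}% (95% CI: {100 * lo:.0f}-{100 * hi:.0f}%)")
```

The result lands close to the published 20-31% interval; small differences at the upper bound are expected if the authors used an exact or Wilson interval instead.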
Abstract:
It is axiomatic that our planet is extensively inhabited by diverse micro-organisms such as bacteria, yet the total diversity of bacterial species remains unknown. Different bacteria can be found from the depths of the oceans to the tops of mountains; even the air is more or less colonized by bacteria. Most bacteria are harmless or even advantageous to human beings, but there are also bacteria that can cause severe infectious diseases or spoil supplies intended for human consumption. Therefore, it is vitally important not only to be able to detect and enumerate bacteria but also to assess their viability and possible harmfulness. While the growth of bacteria is remarkably fast under optimum conditions and easy to detect by culture-based methods, most bacteria are believed to lie in the stationary phase of growth, in which actual growth has ceased, and thus they may simply be undetectable by culture-based techniques. Additionally, several injurious factors, such as low or high temperature or nutrient deficiency, can turn bacteria into a viable but non-culturable (VBNC) state that cannot be detected by culture-based methods. Accordingly, various non-culture-based techniques developed for the assessment of bacterial viability and killing have been widely exploited in modern microbiology. However, only a few methods are suitable for kinetic measurements, which enable the real-time detection of bacterial growth and viability. The present study describes alternative methods for measuring bacterial viability and killing, as well as detecting the effects of various antimicrobial agents on bacteria, on a real-time basis. The suitability of bacterial (lux) and beetle (luc) luciferases as well as green fluorescent protein (GFP) to act as markers of bacterial viability and cell growth was tested.
In particular, a multiparameter microplate assay based on a GFP-luciferase combination, as well as a flow cytometric measurement based on a GFP-PI combination, were developed to perform divergent viability analyses. The results obtained suggest that the antimicrobial activities of various drugs against bacteria can be successfully measured using both of these methods. Specifically, the data reliability of flow cytometric viability analysis was notably improved when GFP was utilized in the assay. A fluoro-luminometric microplate assay enabled kinetic measurements, which significantly improved and accelerated the assessment of bacterial viability compared to more conventional viability assays such as plate counting. Moreover, the multiparameter assay made simultaneous detection of GFP fluorescence and luciferase bioluminescence possible and provided extensive information about multiple cellular parameters in a single assay, thereby increasing the accuracy of the assessment of the kinetics of antimicrobial activities on target bacteria.
Abstract:
Characterizing geological features and structures in three dimensions over inaccessible rock cliffs is needed to assess natural hazards such as rockfalls and rockslides, and also to perform investigations aimed at mapping geological contacts and building stratigraphic and fold models. Indeed, detailed 3D data, such as LiDAR point clouds, allow accurate study of hazard processes and of the structure of geologic features, in particular in vertical and overhanging rock slopes. Thus, 3D geological models have great potential for application to a wide range of geological investigations, both in research and in applied projects such as mines, tunnels and reservoirs. Recent developments in ground-based remote sensing techniques (LiDAR, photogrammetry and multispectral/hyperspectral imaging) are revolutionizing the acquisition of morphological and geological information. As a consequence, there is great potential for improving the modeling of geological bodies, as well as of failure mechanisms and stability conditions, by integrating detailed remote data. During the past ten years, several large rockfall events occurred along important transportation corridors where millions of people travel every year (Switzerland: Gotthard motorway and railway; Canada: Sea to Sky highway between Vancouver and Whistler). These events show that there is still a lack of knowledge concerning the detection of potential rockfalls, leaving mountain settlements and roads at high risk. It is necessary to understand the main factors that destabilize rocky outcrops, even where inventories are lacking and no clear morphological evidence of rockfall activity is observed. In order to increase the possibilities of forecasting potential future landslides, it is crucial to understand the evolution of rock slope stability.
Defining the areas theoretically most prone to rockfalls is particularly useful for simulating trajectory profiles and for generating hazard maps, which are the basis for land-use planning in mountainous regions. The most important questions to address in order to assess rockfall hazard are: Where are the most probable sources of future rockfalls located? What are the frequencies of occurrence of these rockfalls? I characterized the fracturing patterns in the field and with LiDAR point clouds. Afterwards, I developed a model to compute failure mechanisms on terrestrial point clouds in order to assess rockfall susceptibility at the cliff scale. Similar procedures were already available to evaluate rockfall susceptibility based on aerial digital elevation models. This new model makes it possible to detect the most susceptible rockfall sources with unprecedented detail in vertical and overhanging areas. The computed most probable rockfall source areas in granitic cliffs of Yosemite Valley and the Mont-Blanc massif were then compared to inventoried rockfall events to validate the calculation methods. Yosemite Valley was chosen as a test area because it has particularly strong rockfall activity (about one rockfall every week), which leads to a high rockfall hazard. The west face of the Dru was also chosen for its significant rockfall activity, and especially because it was affected by some of the largest rockfalls that occurred in the Alps during the last 10 years. Moreover, both areas were suitable because of their huge vertical and overhanging cliffs, which are difficult to study with classical methods. Limit equilibrium models were applied to several case studies to evaluate the effects of different parameters on the stability of rock-slope areas. The impact of the degradation of rock bridges on the stability of large compartments in the west face of the Dru was assessed using finite element modeling.
In particular, I conducted a back-analysis of the large rockfall event of 2005 (265,000 m3) by integrating field observations of joint conditions, characteristics of the fracturing pattern, and results of geomechanical tests on the intact rock. These analyses improved our understanding of the factors that influence the stability of rock compartments and were used to define the most probable future rockfall volumes at the Dru. Terrestrial laser scanning point clouds were also successfully employed to perform geological mapping in 3D, using the intensity of the backscattered signal. Another technique for obtaining vertical geological maps is to combine a triangulated TLS mesh with 2D geological maps. At El Capitan (Yosemite Valley), we built a georeferenced vertical map of the main plutonic rocks that was used to investigate the reasons for the preferential rockwall retreat rate. Additional efforts to characterize the erosion rate were made at Monte Generoso (Ticino, southern Switzerland), where I attempted to improve the estimation of long-term erosion by also taking into account the volumes of unstable rock compartments. Finally, the following points summarize the main outputs of my research: The new model to compute failure mechanisms and rockfall susceptibility with 3D point clouds makes it possible to accurately define the most probable rockfall source areas at the cliff scale. The analysis of the rock bridges at the Dru shows the potential of integrating detailed measurements of the fractures into geomechanical models of rock-mass stability. The correction of the LiDAR intensity signal makes it possible to classify a point cloud according to rock type and then use this information to model complex geologic structures. The integration of these results on rock-mass fracturing and composition with existing methods can improve rockfall hazard assessments and enhance the interpretation of the evolution of steep rock slopes.
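The limit-equilibrium analyses mentioned above rest on a simple factor-of-safety balance; a minimal sketch for planar sliding on a single joint with Mohr-Coulomb strength follows (all input values are illustrative, not parameters from the Dru or Yosemite cases):

```python
import math

def planar_fos(weight_kN, dip_deg, cohesion_kPa, joint_area_m2, friction_deg):
    """Factor of safety for a block sliding on a single planar joint (dry case):
    FoS = (c*A + W*cos(dip)*tan(phi)) / (W*sin(dip))."""
    dip = math.radians(dip_deg)
    phi = math.radians(friction_deg)
    resisting = cohesion_kPa * joint_area_m2 + weight_kN * math.cos(dip) * math.tan(phi)
    driving = weight_kN * math.sin(dip)
    return resisting / driving

# Illustrative block with progressive cohesion loss (a crude analogue of
# rock-bridge degradation): FoS drops below 1 as cohesion vanishes.
for c in (100, 50, 0):  # kPa
    print(c, round(planar_fos(5000, 45, c, 20, 35), 2))
```

The thesis uses far richer geometry from the point clouds, but the same balance of resisting versus driving forces underlies the susceptibility computation.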
-- Characterizing the geology in 3D for inaccessible rock walls is a necessary step for assessing natural hazards such as rockfalls and rockslides, but also for building stratigraphic models or models of folded structures. 3D geological models have great potential for application to a wide range of geological work, in research as well as in applied projects such as mines, tunnels or reservoirs. Recent developments in ground-based remote sensing tools (LiDAR, photogrammetry and multispectral/hyperspectral imaging) are revolutionizing the acquisition of geomorphological and geological information. Consequently, there is great potential for improving the modeling of geological objects, as well as of failure mechanisms and stability conditions, by integrating detailed remotely acquired data. To increase the possibilities of forecasting future rockfalls, it is fundamental to understand the current evolution of rock-wall stability. Defining the zones theoretically most prone to rockfalls can be very useful for simulating block propagation trajectories and for producing hazard maps, which form the basis of land-use planning in mountain regions. The most important questions to resolve in order to estimate rockfall hazard are: Where are the most probable sources of future rockfalls located? How frequently will these events occur? I therefore characterized the fracture networks in the field and with LiDAR point clouds. I then developed a model to compute failure mechanisms directly on the point clouds, in order to evaluate rockfall susceptibility at the scale of the wall.
The most probable rockfall source zones in the granitic walls of Yosemite Valley and the Mont-Blanc massif were computed and then compared with event inventories to verify the methods. Limit equilibrium models were applied to several case studies to evaluate the effects of different parameters on wall stability. The impact of rock-bridge degradation on the stability of large rock compartments in the west face of the Petit Dru was evaluated using finite element modeling. In particular, I analyzed the large 2005 rockfall (265,000 m3), which removed the entire southwest pillar. Into the model I integrated observations of joint conditions, characteristics of the fracture network, and results of geomechanical tests on the intact rock. These analyses improved the estimation of the parameters that influence the stability of rock compartments and served to define probable volumes for future rockfalls. The point clouds obtained with the terrestrial laser scanner were also successfully used to produce 3D geological maps, using the intensity of the reflected signal. Another technique for obtaining geological maps of vertical zones consists of combining a LiDAR mesh with a 2D geological map. At El Capitan (Yosemite Valley), we were able to georeference a vertical map of the main plutonic rocks, which I then used to study the reasons for the preferential erosion of certain zones of the wall. Further efforts to quantify the erosion rate were made at Monte Generoso (Ticino, Switzerland), where I tried to improve the estimate of long-term erosion by taking into account the volumes of unstable rock compartments.
The integration of these results on rock-mass fracturing and composition with existing methods improves the assessment of rockfall and rockslide hazard and increases the possibilities for interpreting the evolution of rock walls.
Abstract:
We have investigated the behavior of bistable cells made up of four quantum dots and occupied by two electrons, in the presence of realistic confinement potentials produced by depletion gates on top of a GaAs/AlGaAs heterostructure. Such a cell represents the basic building block for logic architectures based on the concept of quantum cellular automata (QCA) and of ground state computation, which have been proposed as an alternative to traditional transistor-based logic circuits. We have focused on the robustness of the operation of such cells with respect to asymmetries derived from fabrication tolerances. We have developed a two-dimensional model for the calculation of the electron density in a driven cell in response to the polarization state of a driver cell. Our method is based on the one-shot configuration-interaction technique, adapted from molecular chemistry. From the results of our simulations, we conclude that an implementation of QCA logic based on simple "hole arrays" is not feasible, because of the extreme sensitivity to fabrication tolerances. As an alternative, we propose cells defined by multiple gates, where geometrical asymmetries can be compensated for by adjusting the bias voltages. Even though not immediately applicable to the implementation of logic gates and not suitable for large scale integration, the proposed cell layout should allow an experimental demonstration of a chain of QCA cells.
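The bistable behavior of such a cell is conventionally summarized by the cell polarization computed from the occupations of the four dots; a minimal sketch of this standard QCA definition (independent of the paper's confinement-potential simulations, with dot numbering assumed to alternate around the square):

```python
def qca_polarization(rho):
    """Cell polarization from dot occupations rho1..rho4:
    P = ((rho1 + rho3) - (rho2 + rho4)) / (rho1 + rho2 + rho3 + rho4).
    P = +1 and P = -1 correspond to the two logic states of the bistable cell."""
    r1, r2, r3, r4 = rho
    return ((r1 + r3) - (r2 + r4)) / (r1 + r2 + r3 + r4)

print(qca_polarization([1.0, 0.0, 1.0, 0.0]))  # fully polarized, one diagonal occupied
print(qca_polarization([0.6, 0.4, 0.6, 0.4]))  # partially polarized cell
```

In a driven-cell simulation, the driver's polarization is swept and the driven cell's response P is what reveals the bistable (or, with asymmetries, degraded) transfer characteristic.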
Abstract:
Human biomonitoring (HBM) is an effective tool for assessing actual exposure to chemicals that takes into account all routes of intake. Although hair analysis is considered an optimal biomarker for assessing mercury exposure, the lack of harmonization of sampling and analytical procedures has often limited the comparison of data at national and international levels. The European-funded projects COPHES and DEMOCOPHES developed and tested a harmonized European approach to human biomonitoring in response to the European Environment and Health Action Plan. Herein we describe the quality assurance program (QAP) for assessing mercury levels in hair samples from more than 1800 mother-child pairs recruited in 17 European countries. To ensure the comparability of the results, standard operating procedures (SOPs) for sampling and for mercury analysis were drafted and distributed to participating laboratories. Training sessions were organized for field workers, and four external quality-assessment exercises (ICI/EQUAS), followed by the corresponding web conferences, were organized between March 2011 and February 2012. ICI/EQUAS used native hair samples at two mercury concentration ranges (0.20-0.71 and 0.80-1.63) per exercise. The results revealed relative standard deviations of 7.87-13.55% and 4.04-11.31% for the low and high mercury concentration ranges, respectively. A total of 16 out of 18 participating laboratories met the QAP requirements and were allowed to analyze samples from the DEMOCOPHES pilot study. The web conferences held after each ICI/EQUAS proved to be a new and effective tool for improving analytical performance and increasing capacity building. The procedure developed and tested in COPHES/DEMOCOPHES would be optimal for application on a global scale as regards implementation of the Minamata Convention on Mercury.
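The relative standard deviations quoted for the ICI/EQUAS exercises are straightforward to compute; a minimal sketch with invented inter-laboratory results (not DEMOCOPHES data):

```python
import statistics

def relative_std_dev(values):
    """RSD (%) = 100 * sample standard deviation / mean."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical results from eight laboratories analyzing the same low-range
# hair sample (concentration units omitted, as in the abstract).
lab_results = [0.48, 0.52, 0.50, 0.55, 0.47, 0.51, 0.49, 0.53]
print(f"RSD = {relative_std_dev(lab_results):.1f}%")
```

An exercise "passes" when the RSD across laboratories stays within the agreed tolerance, which is how the quoted 7.87-13.55% and 4.04-11.31% ranges should be read.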
Abstract:
Here we present the results of studies conducted by the Research Unit of Legal Psychiatry and Psychology of Lausanne on risk assessment and protective factors in the evaluation of violence recidivism. The aim is to help experts consider the relevance and use of the tools at their disposal. Particular attention is given to the significance of protective factors and impulsive dimensions, as well as to the inter-rater process that leads to the final conclusions.
Abstract:
The number of qualitative research methods has grown substantially over the last twenty years, both in the social sciences and, more recently, in the health sciences. This growth came with questions about the quality criteria needed to evaluate such work, and numerous guidelines have been published. These guidelines, however, contain many discrepancies, both in their vocabulary and in their construction. Many expert evaluators decry the absence of consensual and reliable evaluation tools. The authors present the results of an evaluation of 58 existing guidelines in 4 major health-science fields (medicine and epidemiology; nursing and health education; social sciences and public health; psychology/psychiatry, research methods and organization) by expert users (article reviewers, experts allocating funds, editors, etc.). The results propose a toolbox of 12 consensual criteria with the definitions given by expert users. They also indicate in which disciplinary field each type of criterion is considered more or less essential. Nevertheless, the authors highlight the limits of criteria comparability once one focuses on their specific definitions. They conclude that each criterion in the toolbox must be explicated in order to reach a broader consensus and to identify definitions that are consensual across all the fields examined and easily operational.
Abstract:
The effective notch stress approach for the fatigue strength assessment of welded structures, as included in the Fatigue Design Recommendations of the IIW, requires numerical analysis of the elastic notch stress at the weld toe and weld root, which are fictitiously rounded with a radius of 1 mm. The goal of this thesis was to consider alternative meshing strategies when using the effective notch stress approach to assess the fatigue strength of load-carrying, partial-penetration fillet-welded cruciform joints. To establish guidelines for modeling the joint and evaluating the results, various two-dimensional (2D) finite element analyses were carried out, systematically varying the plate thickness, the weld throat thickness, the degree of bending, and the shape and location of the modeled effective notch. To extend the scope of this work, studies were also carried out on the influence of
Abstract:
Objective: Independently of total caloric intake, a better quality of the diet (for example, conformity to the Mediterranean diet) is associated with lower obesity risk. It is unclear whether a brief dietary assessment tool, instead of full-length comprehensive methods, can also capture this association. In addition to reduced costs, a brief tool has the interesting advantage of allowing immediate feedback to participants in interventional studies. Another relevant question is which individual items of such a brief tool are responsible for this association. We examined these associations using a 14-item tool of adherence to the Mediterranean diet as exposure and body mass index, waist circumference and waist-to-height ratio (WHtR) as outcomes. Design: Cross-sectional assessment of all participants in the "PREvención con DIeta MEDiterránea" (PREDIMED) trial. Subjects: 7,447 participants (55-80 years, 57% women) free of cardiovascular disease, but with either type 2 diabetes or ≥3 cardiovascular risk factors. Trained dietitians used both a validated 14-item questionnaire and a full-length validated 137-item food frequency questionnaire to assess dietary habits. Trained nurses measured weight, height and waist circumference. Results: Strong inverse linear associations between the 14-item tool and all adiposity indexes were found. For a two-point increment in the 14-item score, the multivariable-adjusted differences in WHtR were −0.0066 (95% confidence interval, −0.0088 to −0.0049) for women and −0.0059 (−0.0079 to −0.0038) for men. The multivariable-adjusted odds ratio for a WHtR > 0.6 in participants scoring ≥10 points versus ≤7 points was 0.68 (0.57 to 0.80) for women and 0.66 (0.54 to 0.80) for men. High consumption of nuts and low consumption of sweetened/carbonated beverages presented the strongest inverse associations with abdominal obesity.
Conclusions: A brief 14-item tool was able to capture a strong monotonic inverse association between adherence to a good quality dietary pattern (Mediterranean diet) and obesity indexes in a population of adults at high cardiovascular risk.
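The per-2-point difference in WHtR reported above can be illustrated with a simple least-squares fit on simulated data (the built-in slope and noise level are assumptions chosen to land near the reported estimates, not the PREDIMED data, and no covariate adjustment is performed here):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5000

# Simulated 14-item adherence scores and WHtR values with an assumed linear
# effect of -0.003 WHtR units per score point (-0.006 per 2 points).
score = rng.integers(0, 15, n)
whtr = 0.62 - 0.003 * score + rng.normal(0, 0.05, n)

# Ordinary least-squares line: slope is the per-point difference in WHtR
slope, intercept = np.polyfit(score, whtr, 1)
print(f"difference per 2-point increment: {2 * slope:.4f}")
```

The fitted per-2-point difference recovers the simulated effect, which is the same order of magnitude as the multivariable-adjusted estimates quoted in the abstract.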