947 results for Link quality estimation
Abstract:
Throughout the world, pressures on water resources are increasing, mainly as a result of human activity. Because of their accessibility, groundwater and surface water are the most used reservoirs. The evaluation of water quality requires the identification of the interconnections among water reservoirs, natural landscape features, human activities and aquatic health. This study focuses on the estimation of water pollution linked to two different environmental issues: salt water intrusion and acid mine drainage related to the exploitation of natural resources. The effects of salt water intrusion occurring in the shallow aquifer north of Ravenna (Italy) were analysed through the study of ion exchange occurring in the area and its variation throughout the year, applying a depth-specific sampling method. Ion exchange, calcite and dolomite precipitation, gypsum dissolution and sulphate reduction were identified as the main processes controlling the groundwater composition in the study area. High concentrations of arsenic detected only at specific depths indicate a connection with the organic matter. Acid mine drainage effects related to tin extraction in the Bolivian Altiplano were studied in water and sediment matrices. Water contamination is strictly dependent on seasonal variation and on pH and redox conditions. During the dry season, strong evaporation and scarce water flow lead to low pH values, high concentrations of heavy metals in surface waters and precipitation of secondary minerals along the river, which could be released under oxidizing conditions, as demonstrated through sequential extraction analysis. The increase of water flow during the wet season leads to an increase in pH values and a decrease in heavy metal concentrations, due to dilution and, as in the case of iron, to precipitation.
Abstract:
Although several clinical tests have been developed to qualitatively describe complex motor tasks by functional testing, these methods often depend on clinicians' interpretation, experience and training, which makes the assessment results inconsistent and lacking the precision required to objectively assess the effect of a rehabilitative intervention. A more detailed characterization is required to fully capture the various aspects of motor control and performance during complex movements of the lower and upper limbs. Cost-effective and clinically applicable instrumented tests would enable quantitative assessment of performance on a subject-specific basis, overcoming the limitations due to the lack of objectiveness inherent in individual judgment, and possibly disclosing subtle alterations that are not clearly visible to the observer. Postural motion measurements at additional locations, such as the lower and upper limbs and trunk, may be necessary in order to obtain information about inter-segmental coordination during the different functional tests used in clinical practice. With these considerations in mind, this Thesis aims: i) to propose a novel quantitative assessment tool for the kinematic and dynamic evaluation of a multi-link kinematic chain during several functional motor tasks (i.e. squat, sit-to-stand, postural sway), using one single-axis accelerometer per segment; ii) to present a novel quantitative technique for upper limb joint kinematics estimation, considering a 3-link kinematic chain during the Fugl-Meyer Motor Assessment and using one inertial measurement unit per segment. The suggested methods could bring several benefits to clinical practice. The use of objective biomechanical measurements, provided by inertial sensor-based techniques, may help clinicians to: i) objectively track changes in motor ability, ii) provide timely feedback about the effectiveness of administered rehabilitation interventions, iii) enable intervention strategies to be modified or changed if found to be ineffective, and iv) speed up experimental sessions when several subjects are asked to perform different functional tests.
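As a rough illustration of the kind of single-axis-accelerometer processing referred to in the first aim, the minimal sketch below estimates each segment's inclination from the gravity component measured along one sensing axis and takes differences between adjacent segments as joint angles. It assumes quasi-static motion and a particular mounting orientation; the function names and synthetic readings are illustrative placeholders, not the Thesis's actual estimation method.

```python
import numpy as np

G = 9.81  # gravitational acceleration [m/s^2]

def segment_inclination(acc_axis, g=G):
    """Segment inclination from a single-axis accelerometer reading.

    Assumes quasi-static motion and a sensing axis mounted perpendicular to
    the segment's long axis, so the measured signal is the gravity component
    a = g * sin(theta). Returns angles in radians.
    """
    return np.arcsin(np.clip(np.asarray(acc_axis) / g, -1.0, 1.0))

def joint_angles(per_segment_acc):
    """Relative (joint) angles of a multi-link chain from per-segment readings.

    per_segment_acc: array of shape (n_segments, n_samples), one single-axis
    accelerometer per segment (e.g. shank, thigh, trunk during a squat).
    """
    inclinations = segment_inclination(per_segment_acc)
    # Joint angle between adjacent segments = difference of their inclinations.
    return np.diff(inclinations, axis=0)

# Synthetic readings in m/s^2 for three segments over five samples.
acc = np.array([[0.0, 1.2, 2.5, 3.4, 4.1],    # shank
                [0.5, 2.0, 3.8, 5.1, 6.0],    # thigh
                [0.2, 0.9, 1.7, 2.3, 2.8]])   # trunk
print(np.degrees(joint_angles(acc)))
```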
Abstract:
The quality of fish products is inextricably linked to the freshness of the raw material, modulated by appropriate handling and storage conditions, especially the storage temperature after catch. The research presented in this thesis, largely conducted within a research project funded by the Italian Ministry of Agricultural, Food and Forestry Policies (MIPAAF), concerned the evaluation of the freshness of farmed and wild fish species in relation to different storage conditions, in ice (0°C) or at refrigeration temperature (4°C). Several specimens of different species, namely bogue (Boops boops), red mullet (Mullus barbatus), sea bream (Sparus aurata) and sea bass (Dicentrarchus labrax), were examined during storage under the different temperature conditions adopted. The assessed control parameters were physical (texture, through the use of a dynamometer; visual quality, using a computer vision system (CVS)), chemical (through 1H-NMR footprint metabolomics) and sensory (Quality Index Method (QIM)). Microbiological determinations were also carried out on hake (Merluccius merluccius). In general, the obtained results confirmed that the temperature of handling/storage is a key factor in maintaining fish freshness. NMR spectroscopy proved able to quantify and evaluate the degradation kinetics of compounds not selected in advance, even a posteriori, which makes it suitable for the development of new parameters related to quality and freshness. The development of physical methods, particularly image analysis performed by the computer vision system (CVS), for the evaluation of fish degradation is very promising. Among the CVS parameters, skin colour, presence and distribution of gill mucus, and eye shape modification showed high sensitivity for the estimation of fish quality loss as a function of the adopted storage conditions. In particular, the eye concavity index showed a high positive correlation with the total QIM score.
Abstract:
Agri-food supply chains extend beyond national boundaries, partially facilitated by a policy environment that encourages more liberal international trade. Rising concentration within the downstream sector has driven a shift towards "buyer-driven" global value chains (GVCs) that extend internationally through global sourcing and the emergence of multinational key economic players competing with increased emphasis on product quality attributes. Agri-food systems are thus increasingly governed by a range of inter-related public and private standards, both of which are becoming de facto mandatory, especially in supply chains for high-value and quality-differentiated agri-food products; these standards tend to strongly affect upstream agricultural practices and firms' internal organization and strategic behaviour, and to shape the organization of the food chain. Notably, increasing attention has been given to the impact of SPS measures on agri-food trade, particularly on developing countries' export performance. Food and agricultural trade is the vital link in the mutual dependency of the global trade system and developing countries, which derive a substantial portion of their income from food and agricultural trade. In Morocco, fruits and vegetables (especially fresh) are the primary agricultural exports. Because of its labor intensity, this sector (especially citrus and tomato) is particularly important in terms of income and employment generation, especially for the female laborers hired on farms and in packing houses. However, the emergence of agricultural and agri-food product safety issues and the subsequent tightening of market requirements have challenged these mutual gains, given the lack of technical and financial capacities in most developing countries.
Abstract:
Proper sample size estimation is an important part of clinical trial methodology and is closely related to the precision and power of a trial's results. Trials with sufficient sample sizes are scientifically and ethically justified and more credible than trials with insufficient sizes. Planning clinical trials with inadequate sample sizes can be considered a waste of time and resources, as well as unethical, since patients may be enrolled in a study whose expected results will not be trusted and are unlikely to have an impact on clinical practice. Because of the low emphasis placed on sample size calculation in orthodontic clinical trials, the objective of this article is to introduce the orthodontic clinician to the importance and the general principles of sample size calculations for randomized controlled trials, to serve as guidance for study design and as a tool for quality assessment when reviewing published clinical trials in our specialty. Examples of calculations are shown for 2-arm parallel trials applicable to orthodontics. The working examples are analyzed, and the implications of design or inherent complexities in each category are discussed.
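For a flavour of the kind of 2-arm parallel calculation the article describes, the sketch below applies the standard normal-approximation formula for comparing two means, n = 2(z_{1-alpha/2} + z_{1-beta})^2 * sd^2 / delta^2 per arm. The outcome, effect size and standard deviation in the example are hypothetical placeholders, not values taken from the article.

```python
import math
from scipy.stats import norm

def n_per_arm_continuous(delta, sd, alpha=0.05, power=0.80):
    """Sample size per arm for a 2-arm parallel trial comparing two means.

    delta: minimal clinically relevant difference between group means
    sd:    assumed common standard deviation of the outcome
    Uses n = 2 * (z_{1-alpha/2} + z_{1-beta})^2 * sd^2 / delta^2.
    """
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    n = 2 * (z_alpha + z_beta) ** 2 * sd ** 2 / delta ** 2
    return math.ceil(n)  # round up to a whole patient

# Hypothetical example: detect a 2 mm difference in a continuous outcome
# (SD = 4 mm), two-sided alpha = 0.05, power = 80% -> about 63 per arm.
print(n_per_arm_continuous(delta=2.0, sd=4.0))
```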
Abstract:
The objective of this article was to record reporting characteristics related to study quality of research published in the major specialty dental journals with the highest impact factors (Journal of Endodontics, Journal of Oral and Maxillofacial Surgery, American Journal of Orthodontics and Dentofacial Orthopedics, Pediatric Dentistry, Journal of Clinical Periodontology, and International Journal of Prosthetic Dentistry). The included articles were classified into 3 broad subject categories: (1) cross-sectional (snapshot), (2) observational, and (3) interventional. Multinomial logistic regression was conducted for effect estimation, using the journal as the response and randomization, sample size calculation, discussion of confounding, multivariate analysis, effect measurement, and confidence intervals as the explanatory variables. The results showed that cross-sectional studies were the dominant design (55%), whereas observational investigations accounted for 13% and interventions/clinical trials for 32%. Reporting of quality characteristics was low for all variables: random allocation (15%), sample size calculation (7%), confounding issues/possible confounders (38%), effect measurements (16%), and multivariate analysis (21%). Eighty-four percent of the published articles reported a statistically significant main finding, and only 13% presented confidence intervals. The Journal of Clinical Periodontology showed the highest probability of including quality characteristics in reported results among all the dental journals.
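As an illustration of the analysis named above (a multinomial logit with the journal as the response and binary reporting characteristics as explanatory variables), here is a minimal sketch on synthetic data. The column names, journal abbreviations and simulated values are placeholders, not the study's dataset.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 300

# Synthetic stand-in for the extracted reporting characteristics
# (binary indicators) and the journal each article appeared in.
df = pd.DataFrame({
    "randomization":  rng.integers(0, 2, n),
    "sample_calc":    rng.integers(0, 2, n),
    "confounding":    rng.integers(0, 2, n),
    "multivariate":   rng.integers(0, 2, n),
    "effect_measure": rng.integers(0, 2, n),
    "conf_intervals": rng.integers(0, 2, n),
    "journal":        rng.choice(["J1", "J2", "J3", "J4", "J5", "J6"], n),
})

# Multinomial logit: journal as the response, reporting characteristics
# as explanatory variables (mirrors the analysis described in the abstract).
exog = sm.add_constant(df.drop(columns="journal"))
endog = df["journal"].astype("category").cat.codes
result = sm.MNLogit(endog, exog).fit(disp=False)
print(result.summary())
```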
Abstract:
We test for differences in financial reporting quality between companies that are required to file periodically with the SEC and those that are exempted from filing reports with the SEC under Rule 12g3-2(b). We examine three earnings quality measures: conservatism, abnormal accruals, and the predictability of earnings. For all three measures, our results show that financial reporting quality differs between companies that file with the SEC and companies exempt from the filing requirements. This paper provides empirical evidence of a link between filing with the SEC and financial reporting quality for foreign firms.
Abstract:
For virtually all hospitals, utilization rates are a critical managerial indicator of efficiency and are determined in part by turnover time. Turnover time is defined as the time elapsed between surgeries, during which the operating room is cleaned and prepared for the next surgery. Lengthier turnover times result in lower utilization rates, thereby hindering hospitals' ability to maximize the number of patients that can be attended to. In this thesis, we analyze operating room data from a two-year period provided by Evangelical Community Hospital in Lewisburg, Pennsylvania, to understand the variability of the turnover process. From the recorded data provided, we derive our best estimate of turnover time. Recognizing the importance of being able to properly model turnover times in order to improve the accuracy of scheduling, we seek to fit distributions to the set of turnover times. We find that log-normal and log-logistic distributions are well suited to turnover times, although further research must validate this finding. We propose that the choice of distribution depends on the hospital and, as a result, a hospital must choose whether to use the log-normal or the log-logistic distribution. Next, we use statistical tests to identify variables that may potentially influence turnover time. We find that there does not appear to be a correlation between surgery time and turnover time across doctors. However, there are statistically significant differences between the mean turnover times across doctors. The final component of our research entails analyzing and explaining the benefits of introducing control charts as a quality control mechanism for monitoring turnover times in hospitals. Although widely instituted in other industries, control charts are not widely adopted in healthcare environments, despite their potential benefits. A major component of our work is the development of control charts to monitor the stability of turnover times. These charts can be easily instituted in hospitals to reduce the variability of turnover times. Overall, our analysis uses operations research techniques to analyze turnover times and identify ways to lower the mean turnover time and the variability in turnover times. We provide valuable insight into a component of the surgery process that has received little attention but can significantly affect utilization rates in hospitals. Most critically, an ability to more accurately predict turnover times and a better understanding of the sources of variability can result in improved scheduling and heightened hospital staff and patient satisfaction. We hope that our findings can apply to many other hospital settings.
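To make the distribution-fitting and control-chart ideas concrete, here is a minimal sketch on synthetic turnover times: it fits log-normal and log-logistic distributions (scipy's fisk is the log-logistic) and computes Shewhart individuals-chart limits from the moving range. The simulated data and parameter values are placeholders, not the hospital's records.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic stand-in for observed turnover times in minutes.
turnover = rng.lognormal(mean=3.3, sigma=0.45, size=500)

# Fit candidate distributions; floc=0 keeps both supported on (0, inf).
ln_params = stats.lognorm.fit(turnover, floc=0)
ll_params = stats.fisk.fit(turnover, floc=0)   # fisk = log-logistic

for name, dist, params in [("log-normal", stats.lognorm, ln_params),
                           ("log-logistic", stats.fisk, ll_params)]:
    loglik = np.sum(dist.logpdf(turnover, *params))
    ks = stats.kstest(turnover, dist.cdf, args=params)
    print(f"{name}: log-likelihood={loglik:.1f}, KS p-value={ks.pvalue:.3f}")

# Shewhart individuals (I) chart limits for monitoring turnover time stability.
mr = np.abs(np.diff(turnover))         # moving ranges of consecutive times
center = turnover.mean()
sigma_hat = mr.mean() / 1.128          # d2 constant for subgroups of size 2
ucl, lcl = center + 3 * sigma_hat, max(center - 3 * sigma_hat, 0.0)
print(f"I-chart: CL={center:.1f}, UCL={ucl:.1f}, LCL={lcl:.1f} minutes")
```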
Abstract:
Generalized linear mixed models (GLMMs) provide an elegant framework for the analysis of correlated data. Because the likelihood has no closed form, GLMMs are often fit by computational procedures such as penalized quasi-likelihood (PQL). Special cases of these models are generalized linear models (GLMs), which are often fit using algorithms such as iteratively weighted least squares (IWLS). High computational costs and memory constraints often make it difficult to apply these iterative procedures to data sets with a very large number of cases. This paper proposes a computationally efficient strategy based on the Gauss-Seidel algorithm that iteratively fits sub-models of the GLMM to subsetted versions of the data. Additional gains in efficiency are achieved for Poisson models, commonly used in disease mapping problems, because their special collapsibility property allows data reduction through summaries. Convergence of the proposed iterative procedure is guaranteed for canonical link functions. The strategy is applied to investigate the relationship between ischemic heart disease, socioeconomic status and age/gender category in New South Wales, Australia, based on outcome data consisting of approximately 33 million records. A simulation study demonstrates the algorithm's reliability in analyzing a data set with 12 million records for a (non-collapsible) logistic regression model.
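As background for the GLM special case mentioned above, here is a minimal IWLS sketch for a Poisson regression with the canonical log link, run on simulated counts. It illustrates the per-iteration weighted least-squares step only; it does not reproduce the paper's Gauss-Seidel sub-model strategy or its handling of random effects.

```python
import numpy as np

def iwls_poisson(X, y, n_iter=25, tol=1e-8):
    """Iteratively weighted least squares for a Poisson GLM with log link.

    Each iteration solves a weighted least-squares problem on the working
    response z = eta + (y - mu) / mu with weights W = mu (canonical link).
    """
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta
        mu = np.exp(eta)
        z = eta + (y - mu) / mu          # working response
        W = mu                           # IWLS weights for the log link
        XtW = X.T * W                    # X^T W (broadcast over columns)
        beta_new = np.linalg.solve(XtW @ X, XtW @ z)
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta

# Small synthetic check: recover known coefficients from simulated counts.
rng = np.random.default_rng(2)
X = np.column_stack([np.ones(5000), rng.normal(size=(5000, 2))])
true_beta = np.array([0.5, 0.3, -0.2])
y = rng.poisson(np.exp(X @ true_beta))
print(iwls_poisson(X, y))   # close to [0.5, 0.3, -0.2]
```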
Abstract:
Submicroscopic changes in chromosomal DNA copy number dosage are common and have been implicated in many heritable diseases and cancers. Recent high-throughput technologies have a resolution that permits the detection of segmental changes in DNA copy number that span thousands of base pairs across the genome. Genome-wide association studies (GWAS) may simultaneously screen for copy number-phenotype and SNP-phenotype associations as part of the analytic strategy. However, genome-wide array analyses are particularly susceptible to batch effects, as the logistics of preparing DNA and processing thousands of arrays often involve multiple laboratories and technicians, or changes over calendar time to the reagents and laboratory equipment. Failure to adjust for batch effects can lead to incorrect inference and requires inefficient post-hoc quality control procedures that exclude regions associated with batch. Our work extends previous model-based approaches for copy number estimation by explicitly modeling batch effects and using shrinkage to improve locus-specific estimates of copy number uncertainty. Key features of this approach include the use of diallelic genotype calls from experimental data to estimate batch- and locus-specific parameters of background and signal without the requirement of training data. We illustrate these ideas using a study of bipolar disease and a study of chromosome 21 trisomy. The former has batch effects that dominate much of the observed variation in quantile-normalized intensities, while the latter illustrates the robustness of our approach to datasets in which as many as 25% of the samples have altered copy number. Locus-specific estimates of copy number can be plotted on the copy-number scale to investigate mosaicism and guide the choice of appropriate downstream approaches for smoothing the copy number as a function of physical position. The software is open source and implemented in the R package CRLMM, available at Bioconductor (http://www.bioconductor.org).
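The shrinkage idea, borrowing strength across loci so that a noisy batch- or locus-specific estimate is pulled toward a pooled value, can be sketched in a few lines. The weighting scheme below is a generic moment-style illustration with made-up numbers; it is not the hierarchical model implemented in CRLMM.

```python
import numpy as np

def shrink_variances(sample_vars, n_obs, prior_df=10.0):
    """Shrink batch/locus-specific variance estimates toward their pooled mean.

    Each raw estimate is pulled toward the pooled variance with a weight that
    grows with the number of observations behind it; 'prior_df' plays the role
    of prior degrees of freedom. Illustrative only (not the CRLMM model).
    """
    sample_vars = np.asarray(sample_vars, dtype=float)
    n_obs = np.asarray(n_obs, dtype=float)
    pooled = np.average(sample_vars, weights=n_obs)
    weights = n_obs / (n_obs + prior_df)
    return weights * sample_vars + (1.0 - weights) * pooled

# Example: noisy per-locus variance estimates from a small batch are pulled
# toward the pooled value, stabilising downstream copy-number uncertainty.
raw = np.array([0.02, 0.35, 0.08, 0.05])
n = np.array([12, 12, 90, 90])
print(shrink_variances(raw, n))
```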
Abstract:
OBJECTIVE: To determine the characteristics of asthma (A) and allergic rhinitis (AR) among asthma patients in primary care practice. RESEARCH DESIGN AND METHODS: Primary care physicians, pulmonologists, and allergologists were asked to recruit consecutive asthma patients, with or without allergic rhinitis, from their daily practice. Cross-sectional data on symptoms, severity, treatment and impact on quality of life of A and AR were recorded and examined using descriptive statistics. Patients with and without AR were then compared. RESULTS: 1244 asthma patients were included by 211 physicians. Asthma was controlled in 19%, partially controlled in 27% and not controlled in 54%. Asthma treatment was generally based on inhaled corticosteroids (ICS) with or without long-acting beta-2 agonists (78%). A leukotriene receptor antagonist (LTRA) was used by 46% of the patients. Overall, 950 (76%) asthma patients had AR (A + AR) and 294 (24%) did not (A - AR). Compared to patients with A - AR, A + AR patients were generally younger (mean age +/- standard deviation: 42 +/- 16 vs. 50 +/- 19 years, p < 0.001) and fewer used ICS (75% vs. 88%, p < 0.001). LTRA usage was similar in both groups (46% vs. 48%). Asthma was uncontrolled in 53% of A + AR and 57% of A - AR patients. Allergic rhinitis was treated with a mean of 1.9 specific AR medications: antihistamines (77%), nasal steroids (66%) and/or vasoconstrictors (38%), and/or LTRA (42%). Rhinorrhoea, nasal obstruction, or nasal itching were the most frequently reported AR symptoms, and the greatest reported degree of impairment was in daily activities/sports (55%). CONCLUSIONS: Allergic rhinitis was more common among younger asthma patients and increased the burden of symptoms and the need for additional medication, but was associated with improved asthma control. However, most asthma patients remained suboptimally controlled regardless of concomitant AR.
Abstract:
OBJECTIVES: To determine the characteristics of popular breast cancer related websites and whether more popular sites are of higher quality. DESIGN: The search engine Google was used to generate a list of websites about breast cancer. Google ranks search results by measures of link popularity, that is, the number of links to a site from other sites. The top 200 sites returned in response to the query "breast cancer" were divided into "more popular" and "less popular" subgroups by three different measures of link popularity: Google rank and the number of links reported independently by Google and by AltaVista (another search engine). MAIN OUTCOME MEASURES: Type and quality of content. RESULTS: More popular sites according to Google rank were more likely than less popular ones to contain information on ongoing clinical trials (27% v 12%, P=0.01), results of trials (12% v 3%, P=0.02), and opportunities for psychosocial adjustment (48% v 23%, P<0.01). These characteristics were also associated with a higher number of links as reported by Google and AltaVista. More popular sites by number of linking sites were also more likely to provide updates on other breast cancer research, information on legislation and advocacy, and a message board service. Measures of quality such as display of authorship, attribution or references, currency of information, and disclosure did not differ between groups. CONCLUSIONS: Popularity of websites is associated with type rather than quality of content. Sites that include content correlated with popularity may best meet the public's desire for information about breast cancer.
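The popularity comparisons quoted above (e.g. 27% v 12%, P=0.01 for clinical-trial information) are two-proportion comparisons on 2x2 tables; a minimal sketch is shown below. The counts are reconstructed from the reported percentages under the assumption of 100 sites per subgroup, so they are illustrative rather than the study's raw data.

```python
from scipy.stats import chi2_contingency

# Hypothetical reconstruction of the "27% v 12%" comparison: 100 'more
# popular' and 100 'less popular' sites, with vs. without information on
# ongoing clinical trials.
table = [[27, 73],   # more popular: with trials info, without
         [12, 88]]   # less popular: with trials info, without

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.3f}")   # p of roughly 0.01, in line with the abstract
```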
Abstract:
Endothelial dysfunction typically precedes clinical manifestations of cardiovascular disease and provides a potential mechanism for the associations observed between cardiovascular disease and sleep quality. This study examined how subjective and objective indicators of sleep quality relate to endothelial function, as measured by brachial artery flow-mediated dilation (FMD). In a clinical research centre, 100 non-shift-working adults (mean age: 36 years) completed FMD testing and the Pittsburgh Sleep Quality Index, along with a polysomnography assessment to obtain the following measures: slow wave sleep, percentage rapid eye movement (REM) sleep, REM sleep latency, total arousal index, total sleep time, wake after sleep onset, sleep efficiency and apnea-hypopnea index. Bivariate correlations and follow-up multiple regressions examined how FMD related to subjective (i.e., Pittsburgh Sleep Quality Index scores) and objective (i.e., polysomnography-derived) indicators of sleep quality. After FMD showed bivariate correlations with Pittsburgh Sleep Quality Index scores, percentage REM sleep and REM latency, further examination with separate regression models indicated that these associations remained significant after adjustment for sex, age, race, hypertension, body mass index, apnea-hypopnea index, smoking and income (Ps < 0.05). Specifically, as FMD decreased, scores on the Pittsburgh Sleep Quality Index increased (indicating decreased subjective sleep quality) and percentage REM sleep decreased, while REM sleep latency increased (Ps < 0.05). Poorer subjective sleep quality and adverse changes in REM sleep were associated with diminished vasodilation, which could link sleep disturbances to cardiovascular disease.
Abstract:
Purpose: Recovery is a critical link between acute reactions to work stressors and the development of health impairments in the long run. Even though recovery is particularly necessary when recovery opportunities during work are insufficient, research on recovery during weekends is still scarce. To fill this gap, we tested whether the inability to psychologically detach from work mediates the effect of social stressors at work on sleep quality on Sunday night. Design/Methodology: Sixty full-time employees participated in the study. Daily assessment included diaries on psychological detachment and ambulatory actigraphy to assess psychophysiological indicators of sleep quality. Results: Hierarchical regression analyses revealed social stressors at work to be related to psychological detachment and to several sleep quality parameters on Sunday night. Furthermore, psychological detachment from work mediated the effect of social stressors at work on sleep quality. Limitations: Methodological considerations regarding the use of actigraphy data should be taken into account. Research/Practical Implications: Our results show that social stressors at work may lower resources just before people start into the new working week. Originality/Value: This is the first study to show that social stressors at work are an antecedent of psychological detachment on Sunday evening and of objective sleep quality on Sunday.
Abstract:
Instruments for on-farm determination of colostrum quality, such as refractometers and densimeters, are increasingly used on dairy farms. The colour of colostrum is also supposed to reflect its quality. A paler, mature-milk-like colour is associated with a lower colostrum value in terms of its general composition compared with a more yellowish and darker colour. The objective of this study was to investigate the relationships between colour measurement of colostrum using the CIELAB colour space (CIE L*=from white to black, a*=from red to green, b*=from yellow to blue, chroma value G=visually perceived colourfulness) and its composition. Dairy cow colostrum samples (n=117) obtained at 4·7±1·5 h after parturition were analysed for immunoglobulin G (IgG) by ELISA and for fat, protein and lactose by infrared spectroscopy. For colour measurements, a calibrated spectrophotometer was used. At a cut-off value of 50 mg IgG/ml, colour measurement had a sensitivity of 50·0%, a specificity of 49·5%, and a negative predictive value of 87·9%. Colostral IgG concentration was not correlated with the chroma value G, but with relative lightness L*. While milk fat content showed a relationship with the colour parameters L*, a*, b* and G, milk protein content was not correlated with a*, but with L*, b* and G. Lactose concentration in colostrum showed a relationship only with b* and G. In conclusion, parameters of the colour measurement showed clear relationships with colostral IgG, fat, protein and lactose concentrations in dairy cows. The implementation of colour-measuring devices in automatic milking systems and milking parlours might be a potential instrument to assess colostrum quality as well as to detect abnormal milk.
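For reference, CIELAB colourfulness is commonly summarised by the chroma C*_ab = sqrt(a*^2 + b*^2); the sketch below computes it and shows how a colour parameter can be correlated with IgG concentration. The study's chroma value "G" may be defined differently, and the sample values here are made up purely to demonstrate the calculation, not to reproduce the study's correlations.

```python
import numpy as np
from scipy.stats import pearsonr

def cielab_chroma(a_star, b_star):
    """Standard CIELAB chroma C*_ab = sqrt(a*^2 + b*^2)."""
    return np.hypot(a_star, b_star)

# Made-up a*, b*, L* readings and IgG concentrations (mg/ml) for a few
# hypothetical colostrum samples; illustrative values only.
a_star = np.array([1.2, 2.5, 0.8, 3.1, 1.9])
b_star = np.array([14.0, 22.5, 9.8, 25.3, 17.1])
l_star = np.array([88.0, 79.5, 91.2, 77.0, 83.4])
igg = np.array([38.0, 72.0, 30.0, 81.0, 55.0])

chroma = cielab_chroma(a_star, b_star)
r, p = pearsonr(l_star, igg)   # same call pattern for any colour parameter
print(f"chroma values: {np.round(chroma, 1)}")
print(f"Pearson r between L* and IgG on these toy data: {r:.2f} (p = {p:.2f})")
```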