967 results for Robust methods
Abstract:
We examined drivers of article citations using 776 articles published from 1990 to 2012 in a broad-based and high-impact social sciences journal, The Leadership Quarterly. These articles had 1,191 unique authors who, at the time of their most recent article in our dataset, had published 16,817 articles and received 284,777 citations in total. Our models explained 66.6% of the variance in citations and showed that quantitative, review, method, and theory articles were cited significantly more than qualitative articles or agent-based simulations. For quantitative articles, which constituted the majority of the sample, our model explained 80.3% of the variance in citations; certain methods (e.g., use of SEM) and designs (e.g., meta-analysis), as well as theoretical approaches (e.g., use of transformational, charismatic, or visionary-type leadership theories), predicted higher article citations. Regarding the statistical conclusion validity of quantitative articles, articles with endogeneity threats received significantly fewer citations than those using a more robust design or an estimation procedure that ensured correct causal estimation. We make several general recommendations on how to improve research practice and article citations.
Abstract:
Painful neuromas may follow traumatic nerve injury. We carried out a double-blind controlled trial in which patients with a painful neuroma of the lower limb (n = 20) were randomly assigned to treatment by resection of the neuroma and translocation of the proximal nerve stump into either muscle tissue or an adjacent subcutaneous vein. Translocation into a vein led to reduced intensity of pain as assessed by visual analogue scale (5.8 (SD 2.7) vs 3.8 (SD 2.4); p < 0.01), and improved sensory, affective and evaluative dimensions of pain as assessed by the McGill pain score (33 (SD 18) vs 14 (SD 12); p < 0.01). This was associated with an increased level of activity (p < 0.01) and improved function (p < 0.01). Transposition of the nerve stump into an adjacent vein should be preferred to relocation into muscle.
Abstract:
Objectives. The goal of this study was to evaluate a T2-mapping sequence by: (i) measuring the intra- and inter-observer variability of the method in healthy volunteers across two separate scanning sessions with a T2 reference phantom; and (ii) measuring the mean T2 relaxation times by T2-mapping in the infarcted myocardium of patients with subacute myocardial infarction (MI) and comparing them against the gold standard, X-ray coronary angiography, and against the results in healthy volunteers. Background. Myocardial edema is a consequence of tissue inflammation, as seen in MI. It can be visualized by cardiovascular magnetic resonance (CMR) imaging using the T2 relaxation time. T2-mapping is a quantitative methodology that has the potential to address the limitations of conventional T2-weighted (T2W) imaging. Methods. The T2-mapping protocol used for all MRI scans consisted of a radial gradient echo acquisition with a lung-liver navigator for free-breathing acquisition and affine image registration. Mid-basal short-axis slices were acquired. For the T2-map analyses, two observers semi-automatically segmented the left ventricle into 6 segments according to the AHA standards. Eight healthy volunteers (age: 27 ± 4 years; 62.5% male) were scanned in two separate sessions. Seventeen patients (age: 61.9 ± 13.9 years; 82.4% male) with subacute STEMI (70.6%) or NSTEMI underwent a T2-mapping scanning session. Results. In healthy volunteers, the mean inter- and intra-observer variability over the entire short-axis slice (segments 1 to 6) was 0.1 ms (95% confidence interval (CI): -0.4 to 0.5, p = 0.62) and 0.2 ms (95% CI: -2.8 to 3.2, p = 0.94), respectively. T2 relaxation time measurements with and without the phantom correction yielded an average difference of 3.0 ± 1.1% and 3.1 ± 2.1% (p = 0.828), respectively. In patients, the inter-observer variability over the entire short-axis slice (S1-S6) was 0.3 ms (95% CI: -1.8 to 2.4, p = 0.85). Edema location as determined by T2-mapping and the coronary artery occlusion as determined on X-ray coronary angiography agreed in 78.6% of cases, but in only 60% of apical infarcts. All but one of the maximal T2 values in infarct patients were greater than the upper limit of the 95% confidence interval for normal myocardium. Conclusions. The T2-mapping methodology is accurate in detecting infarcted, i.e. edematous, tissue in patients with subacute infarcts. This study further demonstrated that this T2-mapping technique is reproducible and robust enough to be used on a segmental basis for edema detection without the need for a phantom-derived T2 correction factor. This new quantitative T2-mapping technique is promising and should allow serial follow-up studies in patients to improve our knowledge of infarct pathophysiology and infarct healing, and to assess novel treatment strategies for acute infarctions.
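A minimal sketch of how such an inter-observer agreement analysis can be computed (mean bias, its 95% confidence interval, and a one-sample t-test of zero bias). The per-segment T2 values below are synthetic; the study's real measurements are not available here.

```python
import numpy as np
from scipy import stats

# Hypothetical per-segment T2 values (ms) from two observers over the six
# AHA mid-basal segments of eight volunteers; numbers are illustrative only.
rng = np.random.default_rng(5)
obs1 = rng.normal(52.0, 3.0, (8, 6))        # 8 volunteers x 6 segments
obs2 = obs1 + rng.normal(0.1, 1.0, (8, 6))  # observer 2: small random offset

diff = (obs2 - obs1).ravel()                 # paired differences
bias = diff.mean()                           # mean inter-observer bias (ms)
lo, hi = stats.t.interval(0.95, len(diff) - 1, loc=bias, scale=stats.sem(diff))
t, p = stats.ttest_1samp(diff, 0.0)          # test: is the bias zero?
print(f"bias {bias:.2f} ms, 95% CI {lo:.2f} to {hi:.2f}, p = {p:.2f}")
```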
Abstract:
We propose a segmentation method based on the geometric representation of images as 2-D manifolds embedded in a higher dimensional space. The segmentation is formulated as a minimization problem, where the contours are described by a level set function and the objective functional corresponds to the surface of the image manifold. In this geometric framework, both data-fidelity and regularity terms of the segmentation are represented by a single functional that intrinsically aligns the gradients of the level set function with the gradients of the image and results in a segmentation criterion that exploits the directional information of image gradients to overcome image inhomogeneities and fragmented contours. The proposed formulation combines this robust alignment of gradients with attractive properties of previous methods developed in the same geometric framework: 1) the natural coupling of image channels proposed for anisotropic diffusion and 2) the ability of subjective surfaces to detect weak edges and close fragmented boundaries. The potential of such a geometric approach lies in the general definition of Riemannian manifolds, which naturally generalizes existing segmentation methods (the geodesic active contours, the active contours without edges, and the robust edge integrator) to higher dimensional spaces, non-flat images, and feature spaces. Our experiments show that the proposed technique improves the segmentation of multi-channel images, images subject to inhomogeneities, and images characterized by geometric structures like ridges or valleys.
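As a rough illustration of the level-set machinery this family of methods builds on, here is a minimal numpy sketch that evolves a level-set function with a curvature (regularity) term plus a term coupling it to the image gradients. It is a simplified stand-in for the manifold-surface functional described above, not the authors' implementation.

```python
import numpy as np

def segment_levelset(image, n_iter=200, dt=0.1, nu=1.0):
    """Toy level-set evolution: smooth the contour (curvature term) while
    coupling the level-set gradient to the image gradient (alignment term)."""
    ny, nx = image.shape
    y, x = np.mgrid[0:ny, 0:nx]
    # Initialize phi as a signed-distance-like function of a centered disk.
    phi = np.sqrt((x - nx / 2) ** 2 + (y - ny / 2) ** 2) - min(nx, ny) / 4

    gy, gx = np.gradient(image)                       # image gradient field
    for _ in range(n_iter):
        py, px = np.gradient(phi)
        norm = np.sqrt(px ** 2 + py ** 2) + 1e-8
        # Curvature term: div(grad phi / |grad phi|).
        curv = np.gradient(px / norm, axis=1) + np.gradient(py / norm, axis=0)
        # Alignment term: inner product of image and level-set gradients.
        align = gx * px + gy * py
        phi += dt * (nu * curv * norm - align)
    return phi < 0                                    # interior of zero level set

if __name__ == "__main__":
    img = np.zeros((64, 64))
    img[20:44, 20:44] = 1.0            # bright square on dark background
    mask = segment_levelset(img)
    print("segmented pixels:", mask.sum())
```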
Abstract:
Objective: To determine the values of, and study the relationships among, central corneal thickness (CCT), intraocular pressure (IOP), and degree of myopia (DM) in an adult myopic population aged 20 to 40 years in Almeria (southeast Spain). To our knowledge this is the first study of this kind in this region. Methods: An observational, descriptive, cross-sectional study was conducted on a sample of 310 myopic patients (620 eyes) aged 20 to 40 years, selected by gender- and age-stratified sampling proportionally fixed to the size of the population strata, assuming a 20% prevalence of myopia, a 5% margin of error (epsilon), and a 95% confidence interval. We studied IOP, CCT, and DM and their relationships by calculating the mean, standard deviation, 95% confidence interval for the mean, median, Fisher's asymmetry coefficient, and range (maximum, minimum), and by applying the Brown-Forsythe robust test for each variable (IOP, CCT, and DM). Results: In the adult myopic population of Almeria aged 20 to 40 years (mean age 29.8 years), the mean overall CCT was 550.12 μm. The corneas of men were thicker than those of women (P = 0.014). CCT was stable, with no significant differences in CCT values across the 20- to 40-year-old subjects. The mean overall IOP was 13.60 mmHg. Men had a higher IOP than women (P = 0.002). Subjects over 30 years (13.83 mmHg) had a higher IOP than those under 30 (13.38 mmHg) (P = 0.04). The mean overall DM was −4.18 diopters. Men had less myopia than women (P < 0.001). Myopia was stable in the 20- to 40-year-old study population (P = 0.089). A linear relationship was found between CCT and IOP (R2 = 0.152, P ≤ 0.001): CCT accounted for 15.2% of the variance in IOP. However, no linear relationship was found between DM and IOP, or between CCT and DM. Conclusions: CCT was similar to that reported in studies of other populations. IOP tends to increase after the age of 30, and this increase is not accounted for by alterations in CCT values.
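For readers unfamiliar with it, the Brown-Forsythe test is Levene's test centered on the median, available in scipy as levene(..., center='median'). A small sketch on synthetic data (the study's raw measurements are not available here):

```python
import numpy as np
from scipy import stats

# Hypothetical per-eye measurements, illustrative only.
rng = np.random.default_rng(0)
iop_men = rng.normal(14.0, 2.5, 300)    # IOP (mmHg), men
iop_women = rng.normal(13.2, 2.5, 320)  # IOP (mmHg), women

# Brown-Forsythe test = Levene's test with median centering.
bf_stat, bf_p = stats.levene(iop_men, iop_women, center='median')

# Linear relationship between CCT and IOP, analogous to the reported R^2.
cct = rng.normal(550, 35, 620)
iop = 5 + 0.015 * cct + rng.normal(0, 2, 620)
res = stats.linregress(cct, iop)
print(f"Brown-Forsythe p = {bf_p:.3f}, R^2 = {res.rvalue ** 2:.3f}")
```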
Abstract:
Pounamu (NZ jade), or nephrite, is a protected mineral in its natural form following the transfer of ownership back to Ngai Tahu under the Ngai Tahu (Pounamu Vesting) Act 1997. Any theft of nephrite is prosecutable under the Crimes Act 1961. Scientific evidence is essential in cases where origin is disputed. A robust method for discrimination of this material through the use of elemental analysis and compositional data analysis is required. Initial studies have characterised the variability within a given nephrite source. This has included investigation of both in situ outcrops and alluvial material. Methods for the discrimination of two geographically close nephrite sources are being developed.
Key Words: forensic, jade, nephrite, laser ablation, inductively coupled plasma mass spectrometry, multivariate analysis, elemental analysis, compositional data analysis
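A minimal sketch of the kind of compositional-data workflow such source discrimination typically uses: a centered log-ratio (CLR) transform of element fractions followed by linear discriminant analysis. The element fractions and source labels below are synthetic placeholders, not the study's LA-ICP-MS data.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def clr(parts):
    """Centered log-ratio transform for compositional (elemental) data."""
    logp = np.log(parts)
    return logp - logp.mean(axis=1, keepdims=True)

# Hypothetical element fractions for two nephrite sources (e.g. Mg, Si, Ca, Fe).
rng = np.random.default_rng(1)
src_a = rng.dirichlet([40, 30, 20, 10], size=50)
src_b = rng.dirichlet([35, 32, 22, 11], size=50)
X = clr(np.vstack([src_a, src_b]))
y = np.repeat([0, 1], 50)                     # source labels

lda = LinearDiscriminantAnalysis().fit(X, y)
print("resubstitution accuracy:", lda.score(X, y))
```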
Abstract:
The immobile location-allocation (LA) problem is a type of LA problem that consists in determining the service each facility should offer in order to optimize some criterion (such as the global demand), given the positions of the facilities and the customers. The problem is hard: it is combinatorial, with on the order of m^n candidate solutions (where m is the number of possible services and n the number of facilities), and its search space is non-convex with several local optima, so traditional methods cannot be applied directly. We therefore proposed the use of clustering analysis to convert the initial problem into several smaller sub-problems. On this basis, we presented and analyzed the suitability of several clustering methods for partitioning the LA problem. We then explored the use of metaheuristic techniques such as genetic algorithms, simulated annealing, and cuckoo search to solve the sub-problems after the clustering analysis.
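A minimal sketch of this cluster-then-optimize idea, with hypothetical facilities, customers, and a toy captured-demand objective. Simulated annealing stands in for the paper's metaheuristics, and for simplicity each sub-problem optimizes its cluster's facilities against all customers; all names and the demand model are illustrative.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
facilities = rng.random((30, 2))             # fixed facility positions
customers = rng.random((500, 2))             # fixed customer positions
m_services = 5
demand = rng.random((500, m_services))       # customer demand per service

# Step 1: split the instance into smaller sub-problems by clustering facilities.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(facilities)

def captured(assign, fac, cust, dem):
    """Demand captured: each customer patronizes its nearest facility and
    contributes its demand for the service that facility offers."""
    d = np.linalg.norm(cust[:, None] - fac[None], axis=2)
    nearest = d.argmin(axis=1)
    return dem[np.arange(len(cust)), assign[nearest]].sum()

def anneal(fac, cust, dem, iters=2000, t0=1.0):
    """Simulated annealing over service assignments for one sub-problem."""
    assign = rng.integers(m_services, size=len(fac))
    cur_val = captured(assign, fac, cust, dem)
    best, best_val = assign.copy(), cur_val
    for k in range(iters):
        t = t0 * (1 - k / iters) + 1e-9                  # cooling schedule
        cand = assign.copy()
        cand[rng.integers(len(fac))] = rng.integers(m_services)
        val = captured(cand, fac, cust, dem)
        if val > cur_val or rng.random() < np.exp((val - cur_val) / t):
            assign, cur_val = cand, val
            if val > best_val:
                best, best_val = cand.copy(), val
    return best, best_val

for c in range(3):
    sel = labels == c
    _, val = anneal(facilities[sel], customers, demand)
    print(f"cluster {c}: captured demand {val:.1f}")
```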
Abstract:
Endometriosis is an inflammatory estrogen-dependent disease defined by the presence of endometrial glands and stroma at extrauterine sites. The main purpose of endometriosis management is alleviating the pain associated with the disease. This can be achieved surgically or medically, although most women require a combination of both treatments, and most need long-term medical treatment. Unfortunately, in most cases, pain symptoms recur within 6 to 12 months once treatment is stopped. The authors conducted a literature search for English original articles related to new medical treatments of endometriosis in humans, including articles published in PubMed, Medline, and the Cochrane Library. Keywords included "endometriosis" matched with "medical treatment", "new treatment", "GnRH antagonists", "aromatase inhibitors", "selective progesterone receptor modulators", "anti-TNF α", and "anti-angiogenic factors". The hormonal treatments currently available are effective in relieving pain associated with endometriosis. Among new hormonal drugs, combination with aromatase inhibitors could be effective in treating women who do not respond to conventional therapies. GnRH antagonists are expected to be as effective as GnRH agonists, but with easier (oral) administration. There is a need for effective treatments that do not block ovarian function; for this purpose, anti-angiogenic factors could become important components of endometriosis therapy in the future. Future research and controlled clinical trials should focus on these drugs.
Abstract:
The objective of the present research is to find control design strategies that are effective and close to real operating conditions. As a novel contribution to structural control strategies, the theories of Interval Modal Arithmetic, Backstepping Control, and QFT (Quantitative Feedback Theory) will be studied. The steps to follow are first to develop new controllers based on the above theories and then to implement the proposed control strategies on different kinds of structures. The report is organized as follows. Chapter 2 presents the state of the art on structural control systems. Chapter 3 presents the most important open problems in the field of structural control. The exploratory work made by the author, the research proposal, and the working plan are given in Chapter 4.
Abstract:
This book gives a general view of sequence analysis, the statistical study of successions of states or events. It includes innovative contributions on life course studies, transitions into and out of employment, contemporaneous and historical careers, and political trajectories. The approach presented in this book is now central to the life-course perspective and the study of social processes more generally. The volume promotes dialogue between approaches to sequence analysis that developed separately, within traditions that contrast across space and disciplines. It includes the latest developments in sequential concepts, coding, atypical datasets and time patterns, optimal matching and alternative algorithms, survey optimization, and visualization. Field studies include original sequential material related to parenting in 19th-century Belgium, higher education and work in Finland and Italy, family formation before and after German reunification, French Jews persecuted in occupied France, long-term trends in electoral participation, and regime democratization. Overall, the book reassesses the classical uses of sequences and promotes new ways of collecting, formatting, representing, and processing them. The introduction provides basic sequential concepts and tools, as well as a history of the method. Chapters are presented in a way that is both accessible to the beginner and informative to the expert.
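Optimal matching, mentioned above, is at its core an edit distance between state sequences. A minimal dynamic-programming sketch, with constant substitution and indel costs chosen purely for illustration:

```python
import numpy as np

def optimal_matching(seq_a, seq_b, indel=1.0, sub=2.0):
    """Optimal-matching (edit) distance between two state sequences,
    the core dissimilarity measure of sequence analysis."""
    n, m = len(seq_a), len(seq_b)
    d = np.zeros((n + 1, m + 1))
    d[:, 0] = np.arange(n + 1) * indel     # delete everything
    d[0, :] = np.arange(m + 1) * indel     # insert everything
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0.0 if seq_a[i - 1] == seq_b[j - 1] else sub
            d[i, j] = min(d[i - 1, j] + indel,        # deletion
                          d[i, j - 1] + indel,        # insertion
                          d[i - 1, j - 1] + cost)     # (mis)match
    return d[n, m]

# Two hypothetical employment trajectories (S=school, E=employed, U=unemployed).
print(optimal_matching("SSEEEEUUEE", "SSSEEEEEEE"))
```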
Abstract:
OBJECTIVE: Postmortem investigations are becoming more and more sophisticated. CT and MRI are already being used in pathology and forensic medicine. In this context, the impact of postmortem angiography increases because of the rapid evaluation of organ-specific vascular patterns, vascular alteration under pathologic and physiologic conditions, and tissue changes induced by artificial and unnatural causes. CONCLUSION: In this article, the advantages and disadvantages of former and current techniques and contrast agents are reviewed.
Abstract:
OBJECTIVE: To assess total free-living energy expenditure (EE) in Gambian farmers with two independent methods, and to determine the most realistic free-living EE and physical activity in order to establish energy requirements for rural populations in developing countries. DESIGN: In this cross-sectional study the two methods were applied at the same time. SETTING: Three rural villages and the Dunn Nutrition Centre Keneba, MRC, The Gambia. SUBJECTS: Eight healthy male subjects were recruited from three rural Gambian villages in the sub-Sahelian area (age: 25 +/- 4 y; weight: 61.2 +/- 10.1 kg; height: 169.5 +/- 6.5 cm; body mass index: 21.2 +/- 2.5 kg/m2). INTERVENTION: We assessed free-living EE with two inconspicuous and independent methods: the first used doubly labeled water (DLW) (2H2 18O) over a period of 12 days, whereas the second was based on continuous heart rate (HR) measurements on two to three days, using individual regression lines (HR vs EE) established by indirect calorimetry in a respiration chamber. Isotopic dilution of deuterium (2H2O) was also used to assess total body water and hence fat-free mass (FFM). RESULTS: EE assessed by DLW was 3880 +/- 994 kcal/day (16.2 +/- 4.2 MJ/day). Expressed per unit body weight, EE averaged 64.2 +/- 9.3 kcal/kg/d (269 +/- 38 kJ/kg/d). These results were consistent with the EE results assessed by HR: 3847 +/- 605 kcal/d (16.1 +/- 2.5 MJ/d), or 63.4 +/- 8.2 kcal/kg/d (265 +/- 34 kJ/kg/d). The physical activity index, expressed as a multiple of basal metabolic rate (BMR), averaged 2.40 +/- 0.41 (DLW) or 2.40 +/- 0.28 (HR). CONCLUSIONS: These findings suggest an extremely high level of physical activity in Gambian men during intense agricultural work (wet season). This contrasts with the relative food shortage previously reported during the harvesting period. We conclude that the assessment of EE during the agricultural season in non-industrialized countries needs further investigation in order to obtain information on the energy requirements of these populations. For this purpose, the use of the DLW and HR methods has been shown to be useful and complementary.
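A minimal sketch of the HR method's core step: fit an individual HR-vs-EE regression line from chamber calorimetry, then apply it to free-living heart-rate recordings. All numbers below are simulated placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical calibration data for one subject: heart rate (bpm) vs energy
# expenditure (kcal/min) measured by indirect calorimetry in a chamber.
hr_cal = np.array([55, 65, 80, 95, 110, 130])
ee_cal = np.array([1.1, 1.4, 2.4, 3.6, 5.0, 7.1])

# Individual regression line (EE vs HR), as used by the HR method.
fit = stats.linregress(hr_cal, ee_cal)

# Simulated minute-by-minute free-living heart rate over one day.
rng = np.random.default_rng(3)
hr_day = rng.normal(95, 20, 24 * 60).clip(45, 180)
ee_day = (fit.intercept + fit.slope * hr_day).clip(min=0.8)   # kcal/min
print(f"estimated daily EE: {ee_day.sum():.0f} kcal")
```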
Abstract:
International conservation organisations have identified priority areas for biodiversity conservation. These global-scale prioritisations affect the distribution of funds for conservation interventions. As each organisation has a different focus, each prioritisation scheme is determined by different decision criteria, and the resultant priority areas vary considerably. However, little is known about how these priority areas will respond to the impacts of climate change. In this paper, we examined the robustness of eight global-scale prioritisations to climate change under various climate predictions from seven global circulation models. We developed a novel metric of climate stability for 803 ecoregions, based on a recently introduced method for estimating the overlap of climate envelopes. The relationships between the decision criteria and the robustness of the global prioritisation schemes were statistically examined. We found that decision criteria related to the level of endemism and landscape fragmentation were strongly correlated with areas predicted to be robust to a changing climate. Hence, policies that prioritise intact areas on grounds of likely cost efficiency, and the associated assumptions about the potential to mitigate the impacts of climate change, require further examination. Our findings will help determine where additional management is required to enable biodiversity to adapt to the impacts of climate change.
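A toy sketch of what a climate-envelope overlap metric can look like: the fractional (Jaccard-style) overlap of axis-aligned min-max boxes over climate variables under current and projected conditions. This is a simplified stand-in for the paper's metric, which the abstract does not specify; all data are hypothetical.

```python
import numpy as np

def envelope_overlap(current, future):
    """Fractional overlap of the axis-aligned climate envelopes (min-max
    boxes over climate variables) of current vs projected conditions."""
    lo = np.maximum(current.min(axis=0), future.min(axis=0))
    hi = np.minimum(current.max(axis=0), future.max(axis=0))
    inter = np.prod(np.clip(hi - lo, 0, None))                  # box intersection
    union = (np.prod(current.max(axis=0) - current.min(axis=0))
             + np.prod(future.max(axis=0) - future.min(axis=0)) - inter)
    return inter / union if union > 0 else 0.0

# Hypothetical (temperature degC, precipitation mm) samples for one ecoregion
# under current climate and one global circulation model projection.
rng = np.random.default_rng(4)
cur = rng.normal([15.0, 900.0], [2.0, 120.0], size=(200, 2))
fut = rng.normal([17.5, 820.0], [2.0, 130.0], size=(200, 2))
print(f"climate stability: {envelope_overlap(cur, fut):.2f}")
```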
Abstract:
Customer satisfaction and retention are key issues for organizations in today's competitive marketplace. As such, much research and revenue has been invested in developing accurate ways of assessing consumer satisfaction at both the macro (national) and micro (organizational) level, facilitating comparisons in performance both within and between industries. Since the instigation of the national customer satisfaction indices (CSI), partial least squares (PLS) has been used to estimate the CSI models in preference to structural equation modeling (SEM) because it does not rely on strict assumptions about the data. However, this choice was based on some misconceptions about the use of SEM and does not take into consideration more recent advances in SEM, including estimation methods that are robust to non-normality and missing data. In this paper, the SEM and PLS approaches were compared by evaluating perceptions of the Isle of Man Post Office's products and customer service using a CSI format. The new robust SEM procedures were found to be advantageous over PLS. Product quality was found to be the only driver of customer satisfaction, while image and satisfaction were the only predictors of loyalty, thus arguing for the specificity of postal services.
Abstract:
This paper presents and discusses the use of Bayesian procedures - introduced through the use of Bayesian networks in Part I of this series of papers - for 'learning' probabilities from data. The discussion will relate to a set of real data on characteristics of black toners commonly used in printing and copying devices. Particular attention is drawn to the incorporation of the proposed procedures as an integral part in probabilistic inference schemes (notably in the form of Bayesian networks) that are intended to address uncertainties related to particular propositions of interest (e.g., whether or not a sample originates from a particular source). The conceptual tenets of the proposed methodologies are presented along with aspects of their practical implementation using currently available Bayesian network software.
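As a minimal illustration of what "learning" probabilities from data means in the Bayesian setting, here is a conjugate Dirichlet-multinomial update of the kind that underlies parameter learning in Bayesian networks. The toner categories and counts are hypothetical placeholders, not the paper's data.

```python
import numpy as np

# A Dirichlet prior over toner-type categories, updated by observed counts.
categories = ["monocomponent", "bicomponent", "other"]
alpha_prior = np.array([1.0, 1.0, 1.0])   # uniform Dirichlet prior
counts = np.array([42, 17, 5])            # observed black-toner samples

alpha_post = alpha_prior + counts          # conjugate posterior
p_mean = alpha_post / alpha_post.sum()     # posterior mean probabilities

for c, p in zip(categories, p_mean):
    print(f"P({c}) = {p:.3f}")
```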