958 results for consensus methods
Abstract:
The greatest effect on reducing mortality in breast cancer comes from the detection and treatment of invasive cancer when it is as small as possible. Although mammography screening is known to be effective, observer errors are frequent and false-negative cancers can be found in retrospective studies of prior mammograms. In 2001, 67 women with 69 surgically proven cancers detected at screening in the Mammography Centre of Helsinki University Hospital also had previous mammograms available. These mammograms were analyzed by an experienced screening radiologist, who found that 36 lesions were already visible in previous screening rounds. CAD (Second Look v. 4.01) detected 23 of these missed lesions. Eight readers with different levels of experience in mammography screening read the films of 200 women with and without CAD. These films included 35 of the missed lesions and 16 screen-detected cancers. CAD sensitivity was 70.6% and specificity 15.8%. Use of CAD lengthened the mean reading time but did not significantly affect readers' sensitivities or specificities; the usefulness of the applied version of CAD (Second Look v. 4.01) is therefore questionable. Because none of the eight readers found exactly the same cancers, two reading methods were compared: summarized independent reading (at least a single cancer-positive opinion within the group considered decisive) and conference consensus reading (the cancer-positive opinion of the reader majority considered decisive). The greatest sensitivity, 74.5%, was achieved when the independent readings of the 4 best-performing readers were summarized. Overall, the summarized independent readings were more sensitive than conference consensus readings (64.7% vs. 43.1%), while there was far less difference in mean specificities (92.4% vs. 97.7%). After detecting a suspicious lesion, the radiologist has to decide on the most accurate, fastest, and most cost-effective means of further work-up.
The feasibility of FNAC and CNB in the diagnosis of breast lesions was compared in a non-randomised, retrospective study of 580 breast lesions (503 malignant) in 572 patients. The absolute sensitivity of CNB was better than that of FNAC: 96% (206/214) vs. 67% (194/289) (p < 0.0001). An additional needle biopsy or surgical biopsy was performed for 93 and 62 patients with FNAC, but for only 2 and 33 patients with CNB. The frequent need for supplementary biopsies and the unnecessary axillary operations due to false-positive findings made FNAC (294) more expensive than CNB (223), and because the advantage of quick analysis vanishes during the overall diagnostic and referral process, CNB is recommended as the initial biopsy method.
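The absolute sensitivities quoted above follow the standard definition, true positives divided by all verified cancers sampled by each method. A minimal sketch reproducing the quoted percentages from the counts given in the abstract:

```python
# Absolute sensitivity = correctly identified malignancies / all verified
# malignancies sampled with that method; counts are taken from the abstract.
def sensitivity_pct(true_positives: int, total_cancers: int) -> float:
    return 100 * true_positives / total_cancers

cnb = sensitivity_pct(206, 214)   # core needle biopsy
fnac = sensitivity_pct(194, 289)  # fine-needle aspiration cytology
print(round(cnb), round(fnac))    # 96 67, matching the abstract
```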
Abstract:
An initial call by the editors of International Research in Geographical and Environmental Education (IRGEE) prompted a study about the inclusion of geography in the Trends in International Mathematics and Science Study (TIMSS) tests. This study found that the geography education community were overwhelmingly in favour of such a move, believing that the information collected would be valuable in enhancing learning outcomes through its impact on research, policy and teaching practice (Lane & Bourke, 2016). However, a number of questions about the development and implementation of this assessment were posed. This paper addresses two of them: (1) What are the global geographical education community's views about Grades 4 and 8 as target year levels for the assessment? and (2) What types of knowledge and cognitive dimensions would they like to see assessed? Based on these findings, the overarching key question that requires further discussion is: Can there be some degree of consensus in terms of what should be assessed and how the test should be implemented?
Abstract:
Idiopathic pulmonary fibrosis (IPF) is an interstitial lung disease with unknown aetiology and poor prognosis. IPF is characterized by alveolar epithelial damage that leads to tissue remodelling and ultimately to the loss of normal lung architecture and function. Treatment has been focused on anti-inflammatory therapies, but due to their poor efficacy new therapeutic modalities are being sought. There is a need for early diagnosis and also for differential diagnostic markers for IPF and other interstitial lung diseases. The study utilized patient material obtained from bronchoalveolar lavage (BAL), diagnostic biopsies or lung transplantation. Human pulmonary fibroblast cell cultures were propagated, and asbestos-induced pulmonary fibrosis in mice was used as an experimental animal model of IPF. Possible markers for IPF were screened by immunohistochemistry, RT-PCR, ELISA and western blot. Matrix metalloproteinases (MMPs) are proteolytic enzymes that participate in tissue remodelling. Microarray studies have introduced potential markers that could serve as additional tools for the assessment of IPF, and one of the most promising was MMP-7. MMP-7 protein levels were measured in the BAL fluid of patients with idiopathic interstitial lung diseases or idiopathic cough. MMP-7 was, however, similarly elevated in the BAL fluid in all of these disorders and thus cannot be used as a differential diagnostic marker for IPF. Activation of transforming growth factor (TGF)-β is considered to be a key element in the progression of IPF. Bone morphogenetic proteins (BMPs) are negative regulators of intracellular TGF-β signalling, and BMP-4 signalling is in turn negatively regulated by gremlin. Gremlin was found to be highly upregulated in IPF lungs and IPF fibroblasts. Gremlin was detected in the thickened IPF parenchyma and the endothelium of small capillaries, whereas in non-specific interstitial pneumonia it localized predominantly in the alveolar epithelium.
Parenchymal gremlin immunoreactivity might indicate IPF-type interstitial pneumonia. Gremlin mRNA levels were higher in patients with end-stage fibrosis, suggesting that gremlin might be a marker for more advanced disease. Characterization of the fibroblastic foci in IPF lungs showed that immunoreactivity to platelet-derived growth factor (PDGF) receptor-α and PDGF receptor-β was elevated in IPF parenchyma, but the fibroblastic foci showed only minor immunoreactivity to the PDGF receptors or the antioxidant peroxiredoxin II. Ki67-positive cells were also observed predominantly outside the fibroblastic foci, suggesting that the fibroblastic foci may not be composed of actively proliferating cells. Imatinib mesylate, assessed as an inhibitor of profibrotic PDGF signalling, reduced asbestos-induced pulmonary fibrosis in mice as well as human pulmonary fibroblast migration in vitro, but it had no effect on lung inflammation.
Abstract:
This study is part of the joint project "The Genetic Epidemiology and Molecular Genetics of Schizophrenia in Finland" between the Departments of Mental Health and Alcohol Research, and Molecular Medicine, at the National Public Health Institute. In the study, we utilized three nationwide health care registers: 1) the Hospital Discharge Register, 2) the Free Medication Register, and 3) the Disability Pension Register, plus the National Population Register, in order to identify all patients with schizophrenia born from 1940 to 1976 (N=33,731) in Finland, and their first-degree relatives. A total of 658 patients with at least one parent born in a homogeneous isolate in northeastern Finland were identified, as well as 4904 familial schizophrenia patients with at least two affected siblings from the whole country. The comparison group was derived from the Health 2000 Study. We collected case records and reassessed the register diagnoses. We contacted the isolate patients and a random sample of patients from the whole country to conduct diagnostic clinical interviews and to assess the negative and positive symptoms and signs of schizophrenia. In addition to these patients, we interviewed siblings who were initially healthy according to the Hospital Discharge Register. Of those with a register diagnosis of schizophrenia, schizoaffective or schizophreniform disorder, 69% received a record-based consensus diagnosis and 63% an interview-based diagnosis of schizophrenia. Patients with schizophrenia who had first-degree relatives with psychotic disorder had more severe affective flattening and alogia than those who were the only affected individuals in their family. The novel findings were: 1) The prevalence of schizophrenia in the isolate was relatively high based on register (1.5%), case record (0.9-1.3%), and interview (0.7-1.2%) data.
2) Isolate patients, regardless of their familial loading for schizophrenia, had fewer delusions and hallucinations than the whole-country familial patients, which may be related to the genetic homogeneity of the isolate. This phenotype encourages the use of endophenotypes in genetic analyses instead of diagnoses alone. 3) The absence of a register diagnosis does not confirm that siblings are healthy, because 7.7% of siblings already had psychotic symptoms before the register diagnoses were identified in 1991. For genetic research, the register diagnosis should therefore be reassessed using either a structured interview or a best-estimate case-note consensus diagnosis. Structured clinical interview methods need to be considered in clinical practice as well.
Abstract:
Objectives In China, “serious road traffic crashes” (SRTCs) are those in which there are 10-30 fatalities, 50-100 serious injuries or a total cost of 50-100 million RMB ($US8-16m), and “particularly serious road traffic crashes” (PSRTCs) are those which are more severe or costly. Due to the large number of fatalities and injuries as well as the negative public reaction they elicit, SRTCs and PSRTCs have become a great concern in China in recent years. The aim of this study is to identify the main factors contributing to these road traffic crashes and to propose preventive measures to reduce their number. Methods 49 contributing factors of the SRTCs and PSRTCs that occurred from 2007 to 2013 were collected from the database “In-depth Investigation and Analysis System for Major Road traffic crashes” (IIASMRTC) and were analyzed through the integrated use of principal component analysis and hierarchical clustering to determine the primary and secondary groups of contributing factors. Results Speeding and overloading of passengers were the primary contributing factors, featuring in up to 66.3% and 32.6% of accidents, respectively. Two secondary contributing factors were road-related: lack of or nonstandard roadside safety infrastructure, and slippery roads due to rain, snow or ice. Conclusions The current approach to SRTCs and PSRTCs is focused on the attribution of responsibility and the enforcement of regulations considered relevant to particular SRTCs and PSRTCs. It would be more effective to investigate the contributing factors and characteristics of SRTCs and PSRTCs as a whole, to provide adequate information for safety interventions in regions where SRTCs and PSRTCs are more common.
In addition to mandating a driver training program and publicising the hazards associated with traffic violations, the implementation of speed cameras, speed signs, road markings and vehicle-mounted GPS is suggested to reduce speeding by passenger vehicles, while increasing regular checks by traffic police and passenger-station staff and improving transportation management to increase the income of contractors and drivers are feasible measures to prevent passenger overloading. Other promising measures include regular inspection of roadside safety infrastructure and improving skid resistance on dangerous road sections in mountainous areas.
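The two-step analysis named in the Methods section, principal component analysis followed by hierarchical clustering of contributing factors, can be sketched generically. The sketch below uses random placeholder data rather than the IIASMRTC records, implements PCA via SVD, and splits the factors into two clusters to mirror the primary/secondary grouping; none of these specifics are claimed to match the authors' actual pipeline.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
X = rng.random((49, 120))            # 49 factors x 120 hypothetical crashes

# PCA via SVD of the mean-centred matrix: keep the first 5 components.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:5].T               # each factor as a 5-D score vector

# Agglomerative (Ward) clustering of the scores, cut into two groups,
# analogous to separating primary from secondary contributing factors.
Z = linkage(scores, method="ward")
groups = fcluster(Z, t=2, criterion="maxclust")
```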
Abstract:
Visual content is a critical component of everyday social media, on platforms explicitly framed around the visual (Instagram and Vine), on those offering a mix of text and images in myriad forms (Facebook, Twitter, and Tumblr), and in apps and profiles where visual presentation and provision of information are important considerations. However, despite being so prominent in forms such as selfies, looping media, infographics, memes, online videos, and more, sociocultural research into the visual as a central component of online communication has lagged behind the analysis of popular, predominantly text-driven social media. This paper underlines the increasing importance of visual elements to digital, social, and mobile media within everyday life, addressing the significant research gap in methods for tracking, analysing, and understanding visual social media as both image-based and intertextual content. In this paper, we build on our previous methodological considerations of Instagram in isolation to examine further questions, challenges, and benefits of studying visual social media more broadly, including methodological and ethical considerations. Our discussion is intended as a rallying cry and provocation for further research into visual (and textual and mixed) social media content, practices, and cultures, mindful of both the specificities of each form, but also, and importantly, the ongoing dialogues and interrelations between them as communication forms.
Abstract:
Modern-day weather forecasting is highly dependent on Numerical Weather Prediction (NWP) models as the main data source. The evolving state of the atmosphere with time can be numerically predicted by solving a set of hydrodynamic equations, if the initial state is known. However, such a modelling approach always contains approximations that by and large depend on the purpose of use and the resolution of the models. Present-day NWP systems operate with horizontal model resolutions in the range from about 40 km to 10 km. Recently, the aim has been to reach operational scales of 1–4 km. This requires fewer approximations in the model equations, more complex treatment of physical processes and, furthermore, more computing power. This thesis concentrates on the physical parameterization methods used in high-resolution NWP models. The main emphasis is on the validation of the grid-size-dependent convection parameterization in the High Resolution Limited Area Model (HIRLAM) and on a comprehensive intercomparison of radiative-flux parameterizations. In addition, the problems related to wind prediction near the coastline are addressed with high-resolution meso-scale models. The grid-size-dependent convection parameterization is clearly beneficial for NWP models operating with a dense grid. Results show that the current convection scheme in HIRLAM is still applicable down to a 5.6 km grid size. However, with further improvements in model resolution, the model's tendency to overestimate strong precipitation intensities increases in all the experiment runs. For the clear-sky longwave radiation parameterization, the schemes used in NWP models provide much better results than simple empirical schemes. On the other hand, for the shortwave part of the spectrum, the empirical schemes are more competitive at producing fairly accurate surface fluxes.
Overall, even the complex radiation parameterization schemes used in NWP models seem to be slightly too transparent to both long- and shortwave radiation in clear-sky conditions. For cloudy conditions, simple cloud correction functions are tested. In the case of longwave radiation, the empirical cloud correction methods provide rather accurate results, whereas for shortwave radiation the benefit is only marginal. Idealised high-resolution two-dimensional meso-scale model experiments suggest that the reason for the observed formation of the afternoon low-level jet (LLJ) over the Gulf of Finland is an inertial oscillation mechanism, when the large-scale flow is from the south-east or west. The LLJ is further enhanced by the sea-breeze circulation. A three-dimensional HIRLAM experiment with a 7.7 km grid size is able to generate a similar LLJ flow structure to that suggested by the 2D experiments and observations. It is also pointed out that improved model resolution does not necessarily lead to better wind forecasts in the statistical sense. In nested systems, the quality of the large-scale host model is critically important, especially if the inner meso-scale model domain is small.
Abstract:
Purpose The research purpose was to identify both the inspiration sources used by fast fashion designers and ways the designers sort information from the sources during the product development process. Design/methodology/approach This is a qualitative study, drawing on semi-structured interviews conducted with the members of the in-house design teams of three Australian fast fashion companies. Findings Australian fast fashion designers rely on a combination of trend data, sales data, product analysis and travel for design development ideas. The designers then use the consensus and embodiment methods to interpret and synthesise information from those inspiration sources. Research limitations/implications The empirical data used in the analysis were limited by interviewing fashion designers within only three Australian companies. Originality/value This research augments knowledge of fast fashion product development, in particular designers’ methods and approaches to product design within a volatile and competitive market.
Abstract:
An efficient and statistically robust solution for the identification of asteroids among numerous sets of astrometry is presented. In particular, numerical methods have been developed for the short-term identification of asteroids at discovery, and for the long-term identification of scarcely observed asteroids over apparitions, a task which has been lacking a robust method until now. The methods are based on the solid foundation of statistical orbital inversion properly taking into account the observational uncertainties, which allows for the detection of practically all correct identifications. Through the use of dimensionality-reduction techniques and efficient data structures, the exact methods have a loglinear, that is, O(n log n), computational complexity, where n is the number of included observation sets. The methods developed are thus suitable for future large-scale surveys which anticipate a substantial increase in the astrometric data rate. Due to the discontinuous nature of asteroid astrometry, separate sets of astrometry must be linked to a common asteroid from the very first discovery detections onwards. The reason for the discontinuity in the observed positions is the rotation of the observer with the Earth as well as the motion of the asteroid and the observer about the Sun. Therefore, the aim of identification is to find a set of orbital elements that reproduce the observed positions with residuals similar to the inevitable observational uncertainty. Unless the astrometric observation sets are linked, the corresponding asteroid is eventually lost as the uncertainty of the predicted positions grows too large to allow successful follow-up.
Whereas the presented identification theory and the numerical comparison algorithm are generally applicable, that is, also in fields other than astronomy (e.g., in the identification of space debris), the numerical methods developed for asteroid identification can immediately be applied to all objects on heliocentric orbits with negligible effects due to non-gravitational forces in the time frame of the analysis. The methods developed have been successfully applied to various identification problems. Simulations have shown that the methods developed are able to find virtually all correct linkages despite challenges such as numerous scarce observation sets, astrometric uncertainty, numerous objects confined to a limited region on the celestial sphere, long linking intervals, and substantial parallaxes. Tens of previously unknown main-belt asteroids have been identified with the short-term method in a preliminary study to locate asteroids among numerous unidentified sets of single-night astrometry of moving objects, and scarce astrometry obtained nearly simultaneously with Earth-based and space-based telescopes has been successfully linked despite a substantial parallax. Using the long-term method, thousands of realistic 3-linkages typically spanning several apparitions have so far been found among designated observation sets each spanning less than 48 hours.
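The loglinear complexity claimed above is characteristic of tree-based neighbour search: once each observation set is reduced to a low-dimensional feature point, candidate linkages can be found without comparing all n² pairs. A generic illustration of that idea follows, using random placeholder features and a k-d tree; this is not the thesis's actual comparison algorithm.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(1)
features = rng.random((1000, 3))   # n reduced observation sets, 3-D features

tree = cKDTree(features)           # tree construction: O(n log n)
pairs = tree.query_pairs(r=0.05)   # candidate linkages within radius r
# Only nearby pairs are returned, far fewer than the n*(n-1)/2 possible
# comparisons a brute-force approach would need.
```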
Abstract:
"We thank Mr Gilder for his considered comments and suggestions for alternative analyses of our data. We also appreciate Mr Gilder’s support of our call for larger studies to contribute to the evidence base for preoperative loading with high-carbohydrate fluids..."
Abstract:
To complement the existing treatment guidelines for all tumour types, ESMO organises consensus conferences to focus on specific issues in each type of tumour. The 2nd ESMO Consensus Conference on Lung Cancer was held on 11–12 May 2013 in Lugano. A total of 35 experts met to address several questions on non-small-cell lung cancer (NSCLC) in each of four areas: pathology and molecular biomarkers, first-line/second and further lines of treatment in advanced disease, early-stage disease and locally advanced disease. For each question, recommendations were made including reference to the grade of recommendation and level of evidence. This consensus paper focuses on locally advanced disease.
Abstract:
Foliage density and leaf area index are important vegetation structure variables. They can be measured by several methods, but few have been tested in tropical forests, which have high structural heterogeneity. In this study, foliage density estimates by two indirect methods, the point quadrat and photographic methods, were compared with those obtained by direct leaf counts in the understorey of a wet evergreen forest in southern India. The point quadrat method has a tendency to overestimate, whereas the photographic method consistently and significantly underestimates foliage density. There was stratification within the understorey, with areas close to the ground having higher foliage densities.