959 results for Stated preference methods
Abstract:
Idiopathic pulmonary fibrosis (IPF) is an interstitial lung disease with unknown aetiology and poor prognosis. IPF is characterized by alveolar epithelial damage that leads to tissue remodelling and ultimately to the loss of normal lung architecture and function. Treatment has focused on anti-inflammatory therapies, but because of their poor efficacy new therapeutic modalities are being sought. There is a need for early diagnosis and for markers that differentiate IPF from other interstitial lung diseases. The study utilized patient material obtained from bronchoalveolar lavage (BAL), diagnostic biopsies or lung transplantation. Human pulmonary fibroblast cell cultures were propagated, and asbestos-induced pulmonary fibrosis in mice was used as an experimental animal model of IPF. Candidate markers for IPF were screened by immunohistochemistry, RT-PCR, ELISA and western blot. Matrix metalloproteinases (MMPs) are proteolytic enzymes that participate in tissue remodelling. Microarray studies have introduced potential markers that could serve as additional tools for the assessment of IPF, one of the most promising being MMP-7. MMP-7 protein levels were measured in the BAL fluid of patients with idiopathic interstitial lung diseases or idiopathic cough. MMP-7 was, however, similarly elevated in the BAL fluid in all of these disorders and thus cannot be used as a differential diagnostic marker for IPF. Activation of transforming growth factor (TGF)-β is considered a key element in the progression of IPF. Bone morphogenetic proteins (BMPs) are negative regulators of intracellular TGF-β signalling, and BMP-4 signalling is in turn negatively regulated by gremlin. Gremlin was found to be highly upregulated in IPF lungs and IPF fibroblasts. Gremlin was detected in the thickened IPF parenchyma and in the endothelium of small capillaries, whereas in non-specific interstitial pneumonia it localized predominantly to the alveolar epithelium. 
Parenchymal gremlin immunoreactivity might therefore indicate IPF-type interstitial pneumonia. Gremlin mRNA levels were higher in patients with end-stage fibrosis, suggesting that gremlin might be a marker of more advanced disease. Characterization of the fibroblastic foci in IPF lungs showed that immunoreactivity to platelet-derived growth factor (PDGF) receptor-α and PDGF receptor-β was elevated in IPF parenchyma, but the fibroblastic foci themselves showed only minor immunoreactivity to the PDGF receptors or to the antioxidant peroxiredoxin II. Ki-67-positive cells were also observed predominantly outside the fibroblastic foci, suggesting that the foci may not be composed of actively proliferating cells. When inhibition of profibrotic PDGF signalling by imatinib mesylate was assessed, imatinib mesylate reduced asbestos-induced pulmonary fibrosis in mice as well as human pulmonary fibroblast migration in vitro, but it had no effect on lung inflammation.
Abstract:
Objectives In China, “serious road traffic crashes” (SRTCs) are those with 10-30 fatalities, 50-100 serious injuries or a total cost of 50-100 million RMB ($US8-16m), and “particularly serious road traffic crashes” (PSRTCs) are those that are more severe or costly. Because of the large number of fatalities and injuries, as well as the negative public reaction they elicit, SRTCs and PSRTCs have become a great concern in China in recent years. The aim of this study is to identify the main factors contributing to these crashes and to propose preventive measures to reduce their number. Methods 49 contributing factors for the SRTCs and PSRTCs that occurred from 2007 to 2013 were collected from the database “In-depth Investigation and Analysis System for Major Road traffic crashes” (IIASMRTC) and analyzed through the integrated use of principal component analysis and hierarchical clustering to determine the primary and secondary groups of contributing factors. Results Speeding and passenger overloading were the primary contributing factors, featuring in up to 66.3% and 32.6% of crashes respectively. Two secondary contributing factors were road-related: lacking or nonstandard roadside safety infrastructure, and slippery roads due to rain, snow or ice. Conclusions The current approach to SRTCs and PSRTCs focuses on the attribution of responsibility and the enforcement of regulations considered relevant to particular SRTCs and PSRTCs. It would be more effective to investigate contributing factors and characteristics of SRTCs and PSRTCs as a whole, to provide adequate information for safety interventions in regions where SRTCs and PSRTCs are more common. 
In addition to mandating a driver training program and publicising the hazards associated with traffic violations, implementation of speed cameras, speed signs, road markings and vehicle-mounted GPS is suggested to reduce speeding of passenger vehicles, while increasing regular checks by traffic police and passenger station staff and improving transportation management to increase the income of contractors and drivers are feasible measures to prevent passenger overloading. Other promising measures include regular inspection of roadside safety infrastructure and improving skid resistance on dangerous road sections in mountainous areas.
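The integrated analysis the Methods describe — principal component analysis followed by hierarchical clustering of contributing factors — can be sketched as below. The crash-by-factor matrix, the factor count and the two-cluster cut are hypothetical placeholders, not the study's data:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical binary crash-by-factor matrix: rows are crashes, columns are
# contributing factors (e.g. speeding, passenger overloading, roadside
# infrastructure, slippery road). Illustrative data only.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(40, 6)).astype(float)

# Principal component analysis via SVD of the mean-centred matrix.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)   # variance share per component
loadings = Vt[:2]                 # loadings of each factor on the first two PCs

# Hierarchical (Ward) clustering of the factors by their PC loadings,
# separating them into primary and secondary groups.
Z = linkage(loadings.T, method="ward")
groups = fcluster(Z, t=2, criterion="maxclust")
print(explained[:2], groups)
```

In this two-step design the PCA compresses co-occurring factors onto a few axes, and the clustering then groups factors by how similarly they load on those axes.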
Abstract:
Visual content is a critical component of everyday social media, on platforms explicitly framed around the visual (Instagram and Vine), on those offering a mix of text and images in myriad forms (Facebook, Twitter, and Tumblr), and in apps and profiles where visual presentation and the provision of information are important considerations. However, despite being so prominent in forms such as selfies, looping media, infographics, memes, online videos, and more, sociocultural research into the visual as a central component of online communication has lagged behind the analysis of popular, predominantly text-driven social media. This paper underlines the increasing importance of visual elements to digital, social, and mobile media within everyday life, addressing the significant research gap in methods for tracking, analysing, and understanding visual social media as both image-based and intertextual content. In this paper, we build on our previous methodological considerations of Instagram in isolation to examine further questions, challenges, and benefits of studying visual social media more broadly, including methodological and ethical considerations. Our discussion is intended as a rallying cry and provocation for further research into visual (and textual and mixed) social media content, practices, and cultures, mindful both of the specificities of each form and, importantly, of the ongoing dialogues and interrelations between them as communication forms.
Abstract:
Modern-day weather forecasting is highly dependent on Numerical Weather Prediction (NWP) models as the main data source. The evolving state of the atmosphere can be numerically predicted by solving a set of hydrodynamic equations, if the initial state is known. However, such a modelling approach always contains approximations that by and large depend on the purpose of use and the resolution of the models. Present-day NWP systems operate with horizontal model resolutions in the range from about 40 km to 10 km. Recently, the aim has been to reach scales of 1-4 km operationally. This requires fewer approximations in the model equations, more complex treatment of physical processes and, furthermore, more computing power. This thesis concentrates on the physical parameterization methods used in high-resolution NWP models. The main emphasis is on the validation of the grid-size-dependent convection parameterization in the High Resolution Limited Area Model (HIRLAM) and on a comprehensive intercomparison of radiative-flux parameterizations. In addition, the problems related to wind prediction near the coastline are addressed with high-resolution meso-scale models. The grid-size-dependent convection parameterization is clearly beneficial for NWP models operating with a dense grid. Results show that the current convection scheme in HIRLAM is still applicable down to a 5.6 km grid size. However, with further improved model resolution, the tendency of the model to overestimate strong precipitation intensities increases in all the experiment runs. For clear-sky longwave radiation parameterization, the schemes used in NWP models provide much better results than simple empirical schemes. On the other hand, for the shortwave part of the spectrum, the empirical schemes are more competitive in producing fairly accurate surface fluxes. 
Overall, even the complex radiation parameterization schemes used in NWP models seem to be slightly too transparent to both long- and shortwave radiation in clear-sky conditions. For cloudy conditions, simple cloud correction functions are tested. For longwave radiation, the empirical cloud correction methods provide rather accurate results, whereas for shortwave radiation the benefit is only marginal. Idealised high-resolution two-dimensional meso-scale model experiments suggest that the observed formation of the afternoon low-level jet (LLJ) over the Gulf of Finland is caused by an inertial oscillation mechanism when the large-scale flow is from the south-east or west. The LLJ is further enhanced by the sea-breeze circulation. A three-dimensional HIRLAM experiment, with a 7.7 km grid size, is able to generate an LLJ flow structure similar to that suggested by the 2D experiments and observations. It is also pointed out that improved model resolution does not necessarily lead to better wind forecasts in the statistical sense. In nested systems, the quality of the large-scale host model is very important, especially if the inner meso-scale model domain is small.
Abstract:
Boron neutron capture therapy (BNCT) is a form of chemically targeted radiotherapy that utilises the high neutron capture cross-section of the boron-10 isotope to achieve a preferential dose increase in the tumour. BNCT dosimetry poses a special challenge, as the radiation dose absorbed by the irradiated tissues consists of several different dose components. Dosimetry is important because the effect of the radiation on the tissue is correlated with the radiation dose. Consistent and reliable radiation dose delivery and dosimetry are thus basic requirements for radiotherapy. The international dosimetry recommendations are not directly applicable to BNCT. The existing dosimetry guidance for BNCT provides recommendations but also calls for the investigation of complementary methods for comparison and improved accuracy. In this thesis the quality assurance and stability measurements of the neutron beam monitors used in dose delivery are presented. The beam monitors were found not to be affected by the presence of a phantom in the beam, and the effect of the reactor core power distribution was less than 1%. The weekly stability test with activation detectors has been generally reproducible within the recommended tolerance value of 2%. An established toolkit for the determination of the dose components in epithermal neutron beams is presented and applied in an international dosimetric intercomparison. The quantities (neutron flux, fast neutron and photon dose) measured by the groups in the intercomparison were generally in agreement within the stated uncertainties. However, the uncertainties were large, ranging from 3-30% (1 standard deviation), emphasising the importance of dosimetric intercomparisons if clinical data are to be compared between different centres. Measurements with the Exradin type 2M ionisation chamber have been repeated in the epithermal neutron beam in the same measurement configuration over the course of 10 years. 
The presented results exclude the severe sensitivity changes to thermal neutrons that have been reported for this type of chamber. Microdosimetry and polymer gel dosimetry are studied as complementary methods for epithermal neutron beam dosimetry. For microdosimetry, comparison of the results with ionisation chambers and computer simulation showed that the photon dose measured with microdosimetry was lower than with the two other methods, although the disagreement was within the uncertainties. For the neutron dose, the simulation and microdosimetry results agreed within 10%, while the ionisation chamber technique gave 10-30% lower neutron dose rates than the two other methods. The response of the BANG-3 gel was found to be linear for both photon and epithermal neutron beam irradiation. The dose distribution, normalised to the dose maximum, measured by MAGIC polymer gel agreed well with the simulated result near the dose maximum, while the spatial difference between the measured and simulated 30% isodose lines was more than 1 cm. In both the BANG-3 and MAGIC gel studies, the interpretation of the results was complicated by the presence of high-LET radiation.
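The several dose components mentioned above (boron dose, thermal neutron dose, fast neutron dose and photon dose) are conventionally combined into a single weighted, Gy-equivalent dose, and the component uncertainties propagate into the total. A minimal sketch under that convention; the weighting factors below are illustrative placeholders, not values stated in the thesis:

```python
import math

def weighted_dose(d_boron, d_thermal, d_fast, d_photon,
                  w_boron=1.3, w_thermal=3.2, w_fast=3.2, w_photon=1.0):
    """Combine the four absorbed-dose components (Gy) into one weighted dose
    (Gy-equivalent), each scaled by a placeholder biological weighting factor."""
    return (w_boron * d_boron + w_thermal * d_thermal
            + w_fast * d_fast + w_photon * d_photon)

def combined_uncertainty(sigmas, weights=(1.3, 3.2, 3.2, 1.0)):
    """Propagate component uncertainties (1 s.d., assumed independent)
    in quadrature through the same weighting."""
    return math.sqrt(sum((w * s) ** 2 for w, s in zip(weights, sigmas)))

# Example: 2 Gy boron dose with smaller beam-contaminant components.
total = weighted_dose(2.0, 0.3, 0.2, 0.5)
sigma = combined_uncertainty((0.1, 0.05, 0.05, 0.05))
print(total, sigma)
```

The quadrature sum makes explicit why individual component uncertainties of 3-30% dominate the overall dosimetric accuracy.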
Abstract:
Background The leading causes of morbidity and mortality for people in high-income countries living with HIV are now non-AIDS malignancies, cardiovascular disease and other non-communicable diseases associated with ageing. This protocol describes the trial of HealthMap, a model of care for people with HIV (PWHIV) that includes use of an interactive shared health record and self-management support. The aims of the HealthMap trial are to evaluate engagement of PWHIV and healthcare providers with the model, and its effectiveness for reducing coronary heart disease risk, enhancing self-management, and improving mental health and quality of life of PWHIV. Methods/Design The study is a two-arm cluster randomised trial involving HIV clinical sites in several states in Australia. Doctors will be randomised to the HealthMap model (immediate arm) or to proceed with usual care (deferred arm). People with HIV whose doctors are randomised to the immediate arm receive 1) new opportunities to discuss their health status and goals with their HIV doctor using a HealthMap shared health record; 2) access to their own health record from home; 3) access to health coaching delivered by telephone and online; and 4) access to a peer moderated online group chat programme. Data will be collected from participating PWHIV (n = 710) at baseline, 6 months, and 12 months and from participating doctors (n = 60) at baseline and 12 months. The control arm will be offered the HealthMap intervention at the end of the trial. The primary study outcomes, measured at 12 months, are 1) 10-year risk of non-fatal acute myocardial infarction or coronary heart disease death as estimated by a Framingham Heart Study risk equation; and 2) Positive and Active Engagement in Life Scale from the Health Education Impact Questionnaire (heiQ). Discussion The study will determine the viability and utility of a novel technology-supported model of care for maintaining the health and wellbeing of people with HIV. 
If shown to be effective, the HealthMap model may provide a generalisable, scalable and sustainable system for supporting the care needs of people with HIV, addressing issues of equity of access. Trial registration Universal Trial Number (UTN) U111111506489; ClinicalTrials.gov Id NCT02178930, submitted 29 June 2014
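The first primary outcome is a 10-year coronary risk estimated by a Framingham Heart Study risk equation. Such equations typically take a Cox proportional-hazards form, risk = 1 − S₀^exp(L − M), where L is a linear combination of risk factors, S₀ the baseline 10-year survival and M the mean linear predictor in the derivation cohort. A sketch of that general form; the coefficients, S₀ and M below are hypothetical placeholders, not the published Framingham values:

```python
import math

def ten_year_risk(x, beta, s0=0.9, mean_lp=2.0):
    """10-year event risk in Cox survival form: 1 - s0 ** exp(lp - mean_lp).
    x: risk-factor values (e.g. log-age, log-cholesterol, smoking flag);
    beta: matching regression coefficients. All values here are illustrative."""
    lp = sum(b * xi for b, xi in zip(beta, x))
    return 1.0 - s0 ** math.exp(lp - mean_lp)

# Hypothetical single-covariate example: at the cohort mean the risk
# reduces to 1 - s0.
print(ten_year_risk([2.0], [1.0]))
```

Because the outcome is a deterministic function of the recorded risk factors, it can be recomputed at baseline and 12 months from the same data fields.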
Abstract:
We find sandwiched metal dimers CB5H6M–MCB5H6 (M = Si, Ge, Sn) which are minima on the potential energy surface with a characteristic M–M single bond. The NBO analysis and the M–M distances (2.3, 2.44 and 2.81 Å for M = Si, Ge, Sn) indicate substantial M–M bonding. Formal generation of CB5H6M–MCB5H6 has been studied theoretically. Consecutive substitution of two boron atoms in B7H7^2− by M (Si, Ge, Sn) and carbon respectively, followed by dehydrogenation, may lead to the desired CB5H6M–MCB5H6. We find that the slip-distorted geometry is preferred for MCB5H7 and its dehydrogenated dimer CB5H6M–MCB5H6. The slip distortion of the M–M bond in CB5H6M–MCB5H6 is greater than that of the M–H bond in MCB5H7. Molecular orbital analysis has been carried out to understand the slip distortion. The larger M–M bending in CB5H6M–MCB5H6 compared with the M–H bending in MCB5H7 is suspected to be encouraged by stabilization of one of the M–M π bonding MOs. A preference of M for the apex of the pentagonal skeleton of MCB5H7 over its icosahedral analogue MCB10H11 has been observed.
Abstract:
The increased accuracy of cosmological observations, especially measurements of the cosmic microwave background (CMB), allows us to study the primordial perturbations in greater detail. In this thesis, we allow for correlated isocurvature perturbations alongside the usual adiabatic perturbations. Thus far the simplest six-parameter \Lambda CDM model has been able to accommodate all the observational data rather well. However, we find that the 3-year WMAP data and the 2006 Boomerang data favour a nonzero nonadiabatic contribution to the CMB angular power spectrum. This is a primordial isocurvature perturbation that is positively correlated with the primordial curvature perturbation. Compared with the adiabatic \Lambda CDM model we have four additional parameters describing the increased complexity of the primordial perturbations. Our best-fit model has a 4% nonadiabatic contribution to the CMB temperature variance, and the fit is improved by \Delta\chi^2 = 9.7. We can attribute this preference for isocurvature to a feature in the peak structure of the angular power spectrum, namely the widths of the second and third acoustic peaks. Along the way, we have improved our analysis methods by identifying some issues with the parametrisation of the primordial perturbation spectra and suggesting ways to handle them. Owing to these improvements, the convergence of our Markov chains is improved. The change of parametrisation affects the MCMC analysis because of the change in priors. We have checked our results against this and find only marginal differences between parametrisations.
Abstract:
An efficient and statistically robust solution for the identification of asteroids among numerous sets of astrometry is presented. In particular, numerical methods have been developed for the short-term identification of asteroids at discovery and for the long-term identification of scarcely observed asteroids over apparitions, a task which has lacked a robust method until now. The methods are based on the solid foundation of statistical orbital inversion, properly taking into account the observational uncertainties, which allows for the detection of practically all correct identifications. Through the use of dimensionality-reduction techniques and efficient data structures, the exact methods have a loglinear, that is, O(n log n), computational complexity, where n is the number of included observation sets. The methods developed are thus suitable for future large-scale surveys, which anticipate a substantial increase in the astrometric data rate. Owing to the discontinuous nature of asteroid astrometry, separate sets of astrometry must be linked to a common asteroid from the very first discovery detections onwards. The reason for the discontinuity in the observed positions is the rotation of the observer with the Earth as well as the motion of the asteroid and the observer about the Sun. Therefore, the aim of identification is to find a set of orbital elements that reproduces the observed positions with residuals comparable to the inevitable observational uncertainty. Unless the astrometric observation sets are linked, the corresponding asteroid is eventually lost as the uncertainty of the predicted positions grows too large to allow successful follow-up. 
Whereas the presented identification theory and the numerical comparison algorithm are generally applicable, that is, also in fields other than astronomy (e.g., in the identification of space debris), the numerical methods developed for asteroid identification can immediately be applied to all objects on heliocentric orbits with negligible effects due to non-gravitational forces in the time frame of the analysis. The methods developed have been successfully applied to various identification problems. Simulations have shown that the methods developed are able to find virtually all correct linkages despite challenges such as numerous scarce observation sets, astrometric uncertainty, numerous objects confined to a limited region on the celestial sphere, long linking intervals, and substantial parallaxes. Tens of previously unknown main-belt asteroids have been identified with the short-term method in a preliminary study to locate asteroids among numerous unidentified sets of single-night astrometry of moving objects, and scarce astrometry obtained nearly simultaneously with Earth-based and space-based telescopes has been successfully linked despite a substantial parallax. Using the long-term method, thousands of realistic 3-linkages typically spanning several apparitions have so far been found among designated observation sets each spanning less than 48 hours.
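The loglinear complexity claimed above generally arises when each observation set is reduced to a point in a low-dimensional comparison space and candidate linkages are found with a spatial data structure instead of testing all n² pairs. A sketch of that pattern with a k-d tree; the 2-D reduction and the linking tolerance are hypothetical stand-ins for the thesis's actual dimensionality-reduction step:

```python
import numpy as np
from scipy.spatial import cKDTree

# Each of n observation sets is assumed to have been mapped to a point in a
# low-dimensional space (here a hypothetical 2-D reduction); sets whose
# points fall close together are candidate linkages to the same asteroid.
rng = np.random.default_rng(1)
points = rng.uniform(0.0, 100.0, size=(1000, 2))

tree = cKDTree(points)                    # O(n log n) build
candidate_pairs = tree.query_pairs(r=1.0) # ~O(log n) per query within radius r
print(len(candidate_pairs))
```

Only the surviving candidate pairs then need the expensive orbital-inversion check, which is what keeps the exact method scalable to large-survey data rates.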
Abstract:
Background There is a strong link between antibiotic consumption and the rate of antibiotic resistance. In Australia, the vast majority of antibiotics are prescribed by general practitioners, and the most common indication is acute respiratory infection. The aim of this study is to assess whether implementing a package of integrated, multifaceted interventions reduces antibiotic prescribing for acute respiratory infections in general practice. Methods/design This is a cluster randomised trial comparing two parallel groups of general practitioners in 28 urban general practices in Queensland, Australia: 14 intervention and 14 control practices. The protocol was peer-reviewed by content experts nominated by the funding organization. The study evaluates an integrated, multifaceted evidence-based package of interventions implemented over a six-month period. The included interventions, which have previously been demonstrated to be effective at reducing antibiotic prescribing for acute respiratory infections, are: delayed prescribing; patient decision aids; communication training; commitment to a practice prescribing policy for antibiotics; a patient information leaflet; and near-patient testing with C-reactive protein. In addition, two sub-studies are nested in the main study: (1) point prevalence estimation of the carriage of bacterial upper respiratory pathogens in practice staff and asymptomatic patients; and (2) the feasibility of direct measures of antibiotic resistance by nose/throat swabbing. The main outcome data come from Australia’s national health insurance scheme, Medicare, and will be accessed after the completion of the intervention phase. They include the number of antibiotic prescriptions and the number of patient visits per general practitioner for periods before and during the intervention. The incidence of antibiotic prescriptions will be modelled using the numbers of patients as the denominator and seasonal and other factors as explanatory variables. 
Results will compare the change in prescription rates before and during the intervention in the two groups of practices. Semi-structured interviews will be conducted with the general practitioners and practice staff (practice nurse and/or practice manager) from the intervention practices on conclusion of the intervention phase to assess the feasibility and uptake of the interventions. An economic evaluation will be conducted to estimate the costs of implementing the package, and its cost-effectiveness in terms of cost per unit reduction in prescribing. Discussion The results on the effectiveness, cost-effectiveness, acceptability and feasibility of this package of interventions will inform the policy for any national implementation.
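The core comparison — prescriptions per patient visit, before versus during the intervention — reduces to a rate ratio. A minimal sketch with a 95% confidence interval on the log scale; the counts are hypothetical, and the seasonal covariates used in the trial's full model are omitted:

```python
import math

def rate_ratio(rx_before, visits_before, rx_during, visits_during):
    """Prescribing rate ratio (during / before) with an approximate 95% CI,
    using the usual large-sample standard error of a log rate ratio."""
    rr = (rx_during / visits_during) / (rx_before / visits_before)
    se_log = math.sqrt(1.0 / rx_before + 1.0 / rx_during)
    lo = rr * math.exp(-1.96 * se_log)
    hi = rr * math.exp(1.96 * se_log)
    return rr, (lo, hi)

# Hypothetical practice group: 400 prescriptions over 10,000 visits before,
# 300 over 10,000 during the intervention.
print(rate_ratio(400, 10000, 300, 10000))
```

The full analysis would fit such rates in a regression with season and other explanatory variables, but the rate ratio is the quantity the before/during comparison ultimately reports.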
Abstract:
"We thank Mr Gilder for his considered comments and suggestions for alternative analyses of our data. We also appreciate Mr Gilder’s support of our call for larger studies to contribute to the evidence base for preoperative loading with high-carbohydrate fluids..."
Abstract:
Foliage density and leaf area index are important vegetation structure variables. They can be measured by several methods, but few have been tested in tropical forests, which have high structural heterogeneity. In this study, foliage density estimates obtained by two indirect methods, the point quadrat and photographic methods, were compared with those obtained by direct leaf counts in the understorey of a wet evergreen forest in southern India. The point quadrat method tends to overestimate, whereas the photographic method consistently and significantly underestimates foliage density. There was stratification within the understorey, with areas close to the ground having higher foliage densities.
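The over- and underestimation reported above can be quantified as a mean signed error of each indirect method against the direct counts. A minimal sketch; the paired plot readings are invented for illustration and are not the study's data:

```python
import statistics

# Hypothetical paired foliage-density readings for the same understorey
# plots: direct leaf counts as the reference, plus the two indirect methods.
direct       = [12.0, 15.0,  9.0, 20.0, 14.0]
point_quad   = [13.5, 16.0, 10.5, 22.0, 15.5]   # tends to overestimate
photographic = [ 9.0, 11.5,  7.0, 15.0, 10.5]   # consistently underestimates

def mean_signed_error(estimates, reference):
    """Mean of (estimate - reference); positive means overestimation."""
    return statistics.mean(e - r for e, r in zip(estimates, reference))

print(mean_signed_error(point_quad, direct))    # positive bias
print(mean_signed_error(photographic, direct))  # negative bias
```

A paired test (e.g. a paired t-test on these differences) would then establish whether the photographic method's underestimate is significant, as the abstract states.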