242 results for Difficult-to-Measure Nuclides
Abstract:
Access to transport systems, and the connections those systems provide to essential economic and social activities, is critical in determining households' levels of transportation disadvantage. Despite progress in identifying transportation-disadvantaged groups more accurately, the lack of effective policies has allowed the issue to persist as a significant problem. This paper undertakes a pilot case investigation as a test bed for a new approach developed to reduce transportation policy shortcomings. The approach, a 'disadvantage-impedance index', aims to ease transportation disadvantage by using representative parameters to measure the differences between policy alternatives run in a simulation environment. Implemented in the Japanese town of Arao, the index draws on trip-making behaviour and resident stated-preference data. The results reveal that even a slight improvement in accessibility and travel-quality indicators makes a significant difference in easing disadvantage. Integrated into a four-step model, the index proves highly robust and useful for quickly diagnosing effective actions and developing potentially efficient policies.
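As a purely illustrative sketch of the kind of calculation such an index implies (the indicator names, weights and scenario scores below are hypothetical, not the paper's parameters), a weighted composite can be compared across a baseline and a policy alternative:

```python
# Illustrative disadvantage-impedance style index: indicator names, weights and
# scenario values are hypothetical placeholders, not the study's parameters.
INDICATORS = {"accessibility": 0.5, "travel_time": 0.3, "travel_quality": 0.2}

def disadvantage_index(scenario: dict[str, float]) -> float:
    """Weighted sum of normalised indicator scores (0 = no disadvantage, 1 = worst)."""
    return sum(weight * scenario[name] for name, weight in INDICATORS.items())

baseline = {"accessibility": 0.70, "travel_time": 0.60, "travel_quality": 0.55}
policy_a = {"accessibility": 0.62, "travel_time": 0.58, "travel_quality": 0.50}  # modest improvement

print(f"baseline index: {disadvantage_index(baseline):.3f}")
print(f"policy A index: {disadvantage_index(policy_a):.3f}")
```

In the study itself the parameters are derived from trip-making behaviour and stated-preference data rather than assumed weights.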
Abstract:
Background Context Differences in definitions of endplate lesions (EPLs), often referred to as Schmorl's nodes, may to some extent account for the large range of reported prevalence (3.8% to 76%). Purpose To develop a technique to measure the size, prevalence and location of EPLs in a consistent manner. Study Design/Setting This study proposed a method using a detection algorithm, applied to five adolescent females (average age 15.1 years, range 13.0 to 19.2 years) with idiopathic scoliosis (average major Cobb angle 60°, range 55° to 67°). Methods Existing low-dose computed tomography scans were segmented semi-automatically to extract the 3D morphology of each vertebral endplate. Any remaining attachments to the posterior elements of adjacent vertebrae or endplates were then manually sectioned. An automatic algorithm was used to determine the presence and position of EPLs. Results EPLs were identified in 15 of the 170 (8.8%) endplates analysed, with an average depth of 3.1 mm. Eleven of the 15 EPLs were in the lumbar spine. The algorithm was found to be most sensitive to changes in the minimum EPL gradient at the edges of the EPL. Conclusions This study describes an image-analysis technique for consistent measurement of the prevalence, location and size of EPLs. The technique can be used to analyse large populations without observer error arising from differing EPL definitions.
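A minimal sketch of the kind of gradient-based detection the abstract alludes to, applied to a synthetic one-dimensional endplate height profile; the thresholds and the profile are assumptions for illustration, not the study's algorithm or data:

```python
import numpy as np

def detect_epl(profile_mm: np.ndarray, spacing_mm: float,
               min_edge_gradient: float = 0.5, min_depth_mm: float = 1.0):
    """Flag an endplate lesion when a depression is bounded by steep edges.

    profile_mm: heights of the endplate surface along one scan line.
    min_edge_gradient: minimum slope (mm/mm) at the lesion edges, the kind of
    parameter the abstract reports the algorithm is most sensitive to.
    """
    baseline = np.median(profile_mm)
    depth = baseline - profile_mm.min()
    gradient = np.abs(np.gradient(profile_mm, spacing_mm))
    has_steep_edge = gradient.max() >= min_edge_gradient
    return (depth >= min_depth_mm and has_steep_edge), depth

# Synthetic profile with a 3 mm deep depression in the middle of the endplate.
x = np.linspace(0, 30, 61)
profile = np.zeros_like(x)
profile[25:36] = -3.0
found, depth = detect_epl(profile, spacing_mm=0.5)
print(found, f"depth = {depth:.1f} mm")
```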
Abstract:
The visual characteristics of urban environments have been changing dramatically with the growth of cities around the world. Protecting and enhancing landscape character in urban environments is one of the challenges facing policy makers in addressing sustainable urban growth. Visual openness and enclosure are important attributes in the perception of visual space: they affect human interaction with physical space and are often modified by new developments. Measuring visual openness in urban areas provides a more accurate, reliable and systematic approach to managing and controlling visual qualities in growing cities. Recent advances in geographic information system (GIS) and survey techniques make it feasible to measure and quantify this attribute with a high degree of realism and precision, yet previous studies in this field have not taken full advantage of these improvements. This paper proposes a method to measure visual openness and enclosure in a changing urban landscape on the Gold Coast, Australia, using improved GIS functionality. Using this method, visual openness is calculated and described for all publicly accessible areas in the selected study area, and a final map is produced showing the areas with the highest visual openness and visibility to natural landscape resources. The output of this research can be used by planners and decision-makers to manage and control views in complex urban landscapes. Depending on the availability of GIS data, the method can also be applied to other regions, including non-urban landscapes.
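A rough sketch of how visual openness at a single location might be estimated from a height grid by ray casting; a production workflow would use the viewshed tools of the GIS package itself, and the grid and parameters below are hypothetical:

```python
import numpy as np

def visual_openness(dem: np.ndarray, row: int, col: int,
                    eye_height: float = 1.6, radius: int = 50, n_rays: int = 72) -> float:
    """Fraction of horizontal sight lines that stay unobstructed within `radius` cells.

    A crude stand-in for a GIS viewshed: each ray is marched outward and counted
    as open unless a cell higher than the observer's eye level is encountered.
    """
    z0 = dem[row, col] + eye_height
    open_rays = 0
    for theta in np.linspace(0, 2 * np.pi, n_rays, endpoint=False):
        blocked = False
        for r in range(1, radius + 1):
            i = int(round(row + r * np.sin(theta)))
            j = int(round(col + r * np.cos(theta)))
            if not (0 <= i < dem.shape[0] and 0 <= j < dem.shape[1]):
                break
            if dem[i, j] > z0:          # terrain or building higher than the eye blocks the ray
                blocked = True
                break
        open_rays += not blocked
    return open_rays / n_rays

dem = np.zeros((200, 200))
dem[90:110, 120:140] = 20.0             # a hypothetical building block
print(f"openness at (100, 100): {visual_openness(dem, 100, 100):.2f}")
```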
Abstract:
The concept of big data has already outperformed traditional data management efforts in almost all industries. In other instances it has produced promising results, providing value from the large-scale integration and analysis of heterogeneous data sources, for example genomic and proteomic information. Big data analytics has become increasingly important for describing the very large and complex data sets and analytical techniques used in software applications, owing to its significant advantages, including better business decisions, cost reduction and the delivery of new products and services [1]. In a similar context, the health community has experienced not only more complex and larger data content, but also information systems that contain a large number of data sources with interrelated and interconnected data attributes. This has resulted in challenging and highly dynamic environments, leading to the creation of big data with its many complexities, for instance the sharing of information subject to the security requirements expected by stakeholders. Compared with other sectors, big data analysis in the health sector is still in its early stages. Key challenges include accommodating the volume, velocity and variety of healthcare data in the current deluge of exponential growth. Given the complexity of big data, it is understood that while data storage and accessibility are technically manageable, applying Information Accountability measures to healthcare big data may be a practical way to support information security, privacy and traceability. Transparency is one important measure that can demonstrate integrity, which is a vital factor in healthcare services. Clarity about performance expectations is another Information Accountability measure, necessary to avoid data ambiguity, controversy about interpretation and, finally, liability [2]. According to current studies [3], Electronic Health Records (EHRs) are key information resources for big data analysis and are composed of varied co-created values [3]. Common healthcare information originates from and is used by different actors and groups, which helps in understanding its relationship to other data sources; consequently, healthcare services often operate as an integrated service bundle. Although it is a critical requirement for healthcare services and analytics, it is difficult to find a comprehensive set of guidelines for adopting EHRs to fulfil big data analysis requirements. As a remedy, this research therefore focuses on a systematic approach containing comprehensive guidelines, specifying the data that must be provided, to apply and evaluate big data analysis until the necessary decision-making requirements are fulfilled and the quality of healthcare services is improved. We believe that this approach would subsequently improve quality of life.
Abstract:
Hamstring strain injuries are the predominant injury in many sports, imposing a significant financial and performance burden on athletes and clubs. The ability to identify, and intervene with, individuals considered at high risk of injury is therefore important. One measure that has grown in popularity as an outcome variable in hamstring intervention/prevention studies and rehabilitation is the angle of peak knee flexor torque. This current-opinion article first introduces the measure and the processes behind it. Second, it summarises how the angle of peak knee flexor torque has been proposed as a measure of hamstring strain injury risk. Finally, various limitations are presented, along with how they may influence the measure. These include the lack of muscle specificity, the common use of a concentric contraction mode of assessment, the reliability of the measure, various neural contributions (such as rate of force development and neuromuscular inhibition), and the lack of prospective data showing any predictive value for the measure.
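Extracting the measure itself is straightforward once a torque-angle curve is available; a minimal sketch on synthetic isokinetic data (the curve below is invented for illustration only):

```python
import numpy as np

# Hypothetical isokinetic knee-flexor data: joint angle (deg) and torque (Nm).
angles_deg = np.linspace(10, 100, 91)
torque_nm = 120 * np.exp(-((angles_deg - 30) ** 2) / (2 * 15 ** 2))  # synthetic curve peaking near 30 deg

# The outcome variable discussed in the article: the angle at which torque peaks.
angle_of_peak_torque = angles_deg[np.argmax(torque_nm)]
print(f"angle of peak knee flexor torque: {angle_of_peak_torque:.1f} deg")
```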
Abstract:
INTRODUCTION Standing radiographs are the 'gold standard' for clinical assessment of adolescent idiopathic scoliosis (AIS), with the Cobb angle used to measure the severity and progression of the scoliotic curve. Supine imaging modalities can provide valuable 3D information on scoliotic anatomy; however, because the direction of gravitational loading changes, the geometry of the spine alters between the supine and standing positions, which in turn affects the Cobb angle measurement. Previous studies have consistently reported a 7-10° increase in Cobb angle from supine to standing [1-3]; however, none has reported the effect of endplate pre-selection or which (if any) curve parameters affect the supine-to-standing Cobb angle difference. CONCLUSION There is a statistically significant relationship between the supine-to-standing Cobb angle change and fulcrum flexibility, so this difference can be considered a measure of spinal flexibility. Pre-selecting vertebral endplates causes only minor changes.
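A minimal sketch of the kind of association the conclusion describes, fitting a line between fulcrum flexibility and supine-to-standing Cobb angle change; the numbers are invented placeholders, not the study's measurements:

```python
import numpy as np

# Hypothetical patient data: fulcrum flexibility (%) and supine-to-standing
# Cobb angle change (deg). Values are illustrative only.
flexibility_pct = np.array([35, 42, 50, 55, 61, 68, 74])
cobb_change_deg = np.array([6.5, 7.2, 8.0, 8.4, 9.1, 9.6, 10.2])

slope, intercept = np.polyfit(flexibility_pct, cobb_change_deg, 1)
r = np.corrcoef(flexibility_pct, cobb_change_deg)[0, 1]
print(f"Cobb change = {slope:.3f} * flexibility + {intercept:.2f}  (r = {r:.2f})")
```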
Abstract:
Large Display Arrays (LDAs) use Light Emitting Diodes (LEDs) to present information to a viewing audience. A matrix of individually driven LEDs allows the display area to show text, images and video. LDAs have undergone rapid development over the past 10 years in both modular and semi-flexible formats. This thesis critically analyses the communication architecture and processor functionality of current LDAs and presents an alternative, the Scalable Flexible Large Display Array (SFLDA). SFLDAs are more adaptable to a variety of applications because of enhancements in scalability and flexibility: scalability is the ability to configure SFLDAs from 0.8 m² to 200 m², while flexibility is increased functionality within the processors to handle changes in configuration, together with a communication architecture that standardises two-way communication throughout the SFLDA. Common video platforms such as Digital Visual Interface (DVI), Serial Digital Interface (SDI) and High Definition Multimedia Interface (HDMI) were considered as candidate communication architectures for SFLDAs, as were modulation, fibre optics, capacitive coupling and Ethernet. From an analysis of these architectures, Ethernet was identified as the best solution. Using Ethernet as the communication architecture means that both hardware and software modules can interface to the SFLDA. The Video to Ethernet Processor Unit (VEPU), the Scoreboard, Image and Control Software (SICS) and the Ethernet to LED Processor Unit (ELPU) were developed as the key components in designing and implementing the first SFLDA. Data throughput and spectrophotometer tests were used to measure the effectiveness of Ethernet within the SFLDA constructs. Testing and analysis showed that Ethernet satisfactorily met the requirements of SFLDAs.
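A back-of-envelope sketch of why the communication architecture matters across the stated 0.8 m² to 200 m² range; the pixel pitch, frame rate and colour depth below are assumptions, not the thesis's specifications:

```python
# Rough check of whether a single Ethernet link can carry the pixel stream for a
# given array size; pixel pitch, frame rate and link speed are assumed values.
def required_bandwidth_mbps(area_m2: float, pitch_mm: float = 10.0,
                            frame_rate_hz: float = 60.0, bits_per_pixel: int = 24) -> float:
    pixels_per_m2 = (1000.0 / pitch_mm) ** 2
    total_pixels = area_m2 * pixels_per_m2
    return total_pixels * bits_per_pixel * frame_rate_hz / 1e6

for area in (0.8, 50.0, 200.0):        # the thesis's stated scalability range
    needed = required_bandwidth_mbps(area)
    print(f"{area:6.1f} m2: {needed:8.1f} Mb/s "
          f"({'fits' if needed < 1000 else 'exceeds'} a single gigabit Ethernet link)")
```

At the large end the raw pixel stream exceeds one gigabit link, which is one reason a scalable design needs standardised two-way communication across many segmented links rather than a single point-to-point video interface.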
Abstract:
Big Datasets are endemic, but they are often notoriously difficult to analyse because of their size, heterogeneity, history and quality. The purpose of this paper is to open a discourse on the use of modern experimental design methods to analyse Big Data in order to answer particular questions of interest. By appealing to a range of examples, it is suggested that this perspective on Big Data modelling and analysis has wide generality and advantageous inferential and computational properties. In particular, the principled experimental design approach is shown to provide a flexible framework for analysis that, for certain classes of objectives and utility functions, delivers near equivalent answers compared with analyses of the full dataset under a controlled error rate. It can also provide a formalised method for iterative parameter estimation, model checking, identification of data gaps and evaluation of data quality. Finally, it has the potential to add value to other Big Data sampling algorithms, in particular divide-and-conquer strategies, by determining efficient sub-samples.
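A minimal sketch of the underlying idea, selecting a small designed subsample of a large synthetic dataset and comparing its fit with the full-data fit; the greedy D-optimality criterion here is only one simple choice of utility function, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(0)

# A large synthetic dataset (stand-in for "Big Data") and a simple linear model.
N, k = 100_000, 3
X = np.column_stack([np.ones(N), rng.uniform(-1, 1, size=(N, k))])
beta_true = np.array([1.0, 2.0, -1.0, 0.5])
y = X @ beta_true + rng.normal(scale=0.5, size=N)

def greedy_subsample(X: np.ndarray, n_sub: int, n_candidates: int = 500) -> np.ndarray:
    """Greedily add the candidate row that most increases log det(X'X) of the design.

    A sketch of the experimental-design idea only; practical methods use exchange
    algorithms and richer utility functions tailored to the analysis objective.
    """
    chosen = list(rng.choice(len(X), size=X.shape[1], replace=False))  # start with a basis
    for _ in range(n_sub - len(chosen)):
        cand = rng.choice(len(X), size=n_candidates, replace=False)
        M = X[chosen].T @ X[chosen]
        gains = [np.linalg.slogdet(M + np.outer(X[c], X[c]))[1] for c in cand]
        chosen.append(cand[int(np.argmax(gains))])
    return np.array(chosen)

idx = greedy_subsample(X, n_sub=200)
beta_sub, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
beta_full, *_ = np.linalg.lstsq(X, y, rcond=None)
print("full-data estimate:          ", np.round(beta_full, 3))
print("200-point subsample estimate:", np.round(beta_sub, 3))
```

For this easy synthetic case the designed subsample recovers estimates close to the full-data fit from a tiny fraction of the records, which is the "near equivalent answers" property the abstract refers to.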
Abstract:
Background We have previously demonstrated that circulating NT-proBNP is truncated at the N and C termini. The aims of this study were three-fold: first, to determine whether NT-proBNP levels measured with different antibody pairs correlate with NYHA functional class; second, to evaluate the diagnostic potential of proBNP; and third, to investigate whether combining NT-proBNP assays, with or without proBNP, leads to better diagnostic accuracy. Methods Plasma samples were collected from healthy controls (n = 52) and HF patients (n = 46). Customized AlphaLISA® immunoassays were developed and validated to measure the concentrations of proBNP and NT-proBNP (with antibodies targeting residues 13–45, 13–76 and 28–76). The diagnostic performance and predictive value of the proBNP and NT-proBNP assays, and of their combinations, were evaluated. Results The plasma proBNP assay showed acceptable diagnostic performance. The NT-proBNP13–76 assay is useful in diagnosing and stratifying HF patients, and its diagnostic performance improved on commercial NT-proBNP tests. The combination of the NT-proBNP13–76 and NT-proBNP28–76 assays gave the best diagnostic performance. Conclusion Our results demonstrate that, while there is major heterogeneity in circulating NT-proBNP, specific epitopes of the peptide are extraordinarily stable, providing ideal targets for clinically useful diagnostic assays. Future diagnostic clinical trials should adopt a multimarker approach rather than using a single marker to diagnose HF.
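A sketch of the kind of marker combination the study evaluates, using synthetic stand-in data and a logistic model to pool two assays and compare ROC AUCs; the distributions and effect sizes are assumptions, not the study's measurements:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

# Synthetic stand-in data: log-concentrations of two assays in controls (label 0)
# and HF patients (label 1). Real assay values and effect sizes will differ.
n_ctrl, n_hf = 52, 46
assay_a = np.concatenate([rng.normal(2.0, 0.5, n_ctrl), rng.normal(3.0, 0.6, n_hf)])
assay_b = np.concatenate([rng.normal(2.1, 0.5, n_ctrl), rng.normal(2.8, 0.7, n_hf)])
y = np.concatenate([np.zeros(n_ctrl), np.ones(n_hf)])

print("assay A alone, AUC:", round(roc_auc_score(y, assay_a), 3))
print("assay B alone, AUC:", round(roc_auc_score(y, assay_b), 3))

# Combine the two assays with a logistic model and score the pooled predictor.
# Note: this in-sample AUC is optimistic; a real evaluation would cross-validate.
X = np.column_stack([assay_a, assay_b])
combined = LogisticRegression().fit(X, y).predict_proba(X)[:, 1]
print("combined assays, AUC:", round(roc_auc_score(y, combined), 3))
```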
Abstract:
Sulfur-containing volatile organic compounds are difficult to determine in the atmosphere because of their reactivity. Conventional off-line techniques may suffer analyte losses during transport from the field to the laboratory and during sample preparation. In this study, a novel method was developed to directly measure dimethyl sulfide at parts-per-billion concentrations in the atmosphere using vacuum ultraviolet single photon ionization time-of-flight mass spectrometry. This technique offers continuous sampling at a rate of one measurement per second, or cumulative measurements over longer periods. Laboratory-prepared samples of dimethyl sulfide in pure nitrogen at different concentrations were analyzed at several sampling frequencies. Good precision was achieved with sampling periods of at least 60 seconds, giving a relative standard deviation of less than 25%. The detection limit for dimethyl sulfide was below its 3 ppb olfactory threshold. These results demonstrate that single photon ionization time-of-flight mass spectrometry is a valuable tool for rapid, real-time measurement of sulfur-containing organic compounds in air.
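The reported precision figures amount to a relative-standard-deviation calculation over repeated readings; a small sketch on synthetic 1 Hz data (the nominal concentration and noise level are assumed, purely to illustrate the arithmetic):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic 1 Hz readings for a nominal 5 ppb dimethyl sulfide sample; the noise
# level is an assumption chosen only to illustrate the calculation.
readings_ppb = rng.normal(loc=5.0, scale=2.0, size=600)   # 10 minutes at 1 measurement/s

def relative_std(x: np.ndarray) -> float:
    """Relative standard deviation (%) of a set of measurements."""
    return 100.0 * np.std(x, ddof=1) / np.mean(x)

print(f"RSD of raw 1 s readings: {relative_std(readings_ppb):.1f}%")

# Averaging each 60 s sampling period reduces the scatter of the reported values.
minute_means = readings_ppb.reshape(-1, 60).mean(axis=1)
print(f"RSD across 60 s averages: {relative_std(minute_means):.1f}%")
```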
Abstract:
Chlamydial infections are widespread in koalas across their range, and a solution to this debilitating disease has been sought for over a decade. Antibiotics are the currently accepted therapeutic measure, but they are not an effective treatment because some infections are asymptomatic and efficacy rates are low. A vaccine would therefore be an ideal way to address this infectious disease threat in the wild. Previous vaccine trials have used a three-dose regimen; however, this is very difficult to apply in the field as it would require multiple capture events, which are stressful and invasive for the koala, require skilled koala handlers and represent a significant monetary investment. To overcome these challenges, in this study we used a polyphosphazene-based poly I:C and host defense peptide adjuvant combined with recombinant chlamydial major outer membrane protein (rMOMP) antigen to induce long-lasting (54 weeks) cellular and humoral immunity in female koalas with a novel single immunizing dose. Immunized koalas produced a strong IgG response in plasma as well as at mucosal sites, and showed high levels of C. pecorum-specific neutralizing antibodies in plasma and in vaginal and conjunctival secretions. Chlamydia-specific lymphocyte proliferation responses were produced against both whole chlamydial elementary bodies and rMOMP protein over the 12-month period. The results of this study suggest that a single-dose rMOMP vaccine incorporating a poly I:C, host defense peptide and polyphosphazene adjuvant can stimulate both arms of the immune system in koalas, providing an alternative to antibiotic treatment and/or a three-dose vaccine regimen.
Abstract:
Background The various cell types, and their relative numbers, in multicellular organisms are controlled by growth factors and related extracellular molecules that affect gene expression pathways. However, these substances may have inhibitory and/or stimulatory effects on cell division and cell differentiation depending on the cellular environment, and it is not known how cells respond to them in such an ambiguous way. Many cellular effects have been investigated and reported using cultures of cancer cell lines, in an effort to define normal cellular behaviour from these abnormal cells. A model is offered to explain the harmony of cellular life in multicellular organisms in terms of interacting extracellular substances. Methods A basic model was proposed based on asymmetric cell division, and evidence to support the hypothetical model was accumulated from the literature. In particular, relevant evidence for the Insulin-Like Growth Factor system was selected from published data, especially from certain cell lines, to support the model. The evidence has been selected with the aim of providing a picture of normal cellular responses derived from the cell lines. Results The formation of a pair of coupled cells by asymmetric cell division is an integral part of the model, as is the interaction of couplet molecules derived from these cells. Each couplet cell has a receptor that measures the amount of the couplet molecule produced by the other cell; each cell is either receptor-positive or receptor-negative for the respective receptor. The couplet molecules form a binary complex whose level is also measured by the cell. The hypothesis is strongly supported by a selective collection of circumstantial evidence and by some direct evidence. The basic model can be extended to other cellular interactions. Conclusions These couplet cells and interacting couplet molecules can be viewed as a mechanism that provides a controlled and balanced division of labour between the two progeny cells and, in turn, their progeny. The presence or absence of a particular receptor for a couplet molecule defines a cell type, and the presence or absence of many such receptors defines the cell types of the progeny within cell lineages.
Abstract:
Concept mapping involves determining relevant concepts from a free-text input, where concepts are defined in an external reference ontology. This is an important process that underpins many applications for clinical information reporting, derivation of phenotypic descriptions, and a number of state-of-the-art medical information retrieval methods. Concept mapping can be cast into an information retrieval (IR) problem: free-text mentions are treated as queries and concepts from a reference ontology as the documents to be indexed and retrieved. This paper presents an empirical investigation applying general-purpose IR techniques for concept mapping in the medical domain. A dataset used for evaluating medical information extraction is adapted to measure the effectiveness of the considered IR approaches. Standard IR approaches used here are contrasted with the effectiveness of two established benchmark methods specifically developed for medical concept mapping. The empirical findings show that the IR approaches are comparable with one benchmark method but well below the best benchmark.
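A toy sketch of the IR casting the paper describes, treating a free-text mention as a query and concept descriptions as the indexed documents; the concept IDs, strings and the TF-IDF/cosine ranking below are illustrative stand-ins for the reference ontology and retrieval models actually evaluated:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy reference "ontology": concept IDs and descriptions are made up for
# illustration; a real system would index SNOMED CT / UMLS concept strings.
concepts = {
    "C001": "myocardial infarction heart attack",
    "C002": "type 2 diabetes mellitus",
    "C003": "essential hypertension high blood pressure",
    "C004": "bronchial asthma",
}

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(concepts.values())   # concepts indexed as "documents"

def map_mention(mention: str, top_k: int = 2):
    """Treat the free-text mention as a query and retrieve the best-matching concepts."""
    query_vec = vectorizer.transform([mention])
    scores = cosine_similarity(query_vec, doc_matrix).ravel()
    ranked = sorted(zip(concepts.keys(), scores), key=lambda kv: kv[1], reverse=True)
    return ranked[:top_k]

print(map_mention("patient reports high blood pressure"))
```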
Abstract:
Background Adherence to hypertension management is known to influence blood pressure control in patients with hypertension. It is important to measure patients' adherence behaviours to assist in designing appropriate interventions to improve blood pressure control. Aims The purposes of this study were to use confirmatory factor analysis to revalidate the Therapeutic Adherence Scale for Hypertensive Patients (TASHP), and to calculate a cut-off score for classifying adherence behaviours into two groups: satisfactory and low. Methods Systematic random sampling was used to recruit patients with hypertension in China. Demographic characteristics, TASHP responses and blood pressure were collected. The psychometric tests of the TASHP included construct validity, criterion-related validity, internal reliability and split-half reliability. The area under the receiver operating characteristic curve and the Youden index were used to identify the cut-off score of the TASHP for blood pressure control. Results This study involved 366 patients. Confirmatory factor analysis supported the four-component structure of the TASHP proposed in the original scale development study. The TASHP has satisfactory internal reliability (Cronbach's α > 0.7) and satisfactory split-half reliability (Spearman–Brown coefficients > 0.7). Patients with overall TASHP scores ≥ 109 points were considered to have satisfactory adherence behaviours. Conclusion The TASHP is a valid and reliable instrument for measuring adherence to hypertension management in Chinese patients with hypertension. The cut-off score of 109 points can be used as an effective means of classifying adherence as satisfactory or low.
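A minimal sketch of the ROC/Youden-index cut-off calculation on synthetic stand-in scores; the score distributions are assumptions, so the resulting threshold will not match the study's 109-point cut-off:

```python
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(3)

# Synthetic stand-in data: TASHP total scores, with 1 = blood pressure controlled.
# The distributions below are assumptions used only to illustrate the method.
scores = np.concatenate([rng.normal(100, 12, 180), rng.normal(118, 10, 186)])
controlled = np.concatenate([np.zeros(180), np.ones(186)])

fpr, tpr, thresholds = roc_curve(controlled, scores)
youden_j = tpr - fpr                      # Youden's J = sensitivity + specificity - 1
best = np.argmax(youden_j)
print(f"cut-off score maximising Youden's J: {thresholds[best]:.0f} "
      f"(sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f})")
```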
Identifying relevant information for emergency services from Twitter in response to natural disasters
Abstract:
This project proposes a framework that identifies high-value disaster-related information from social media to support key decision-making processes during natural disasters. At present it is very difficult to distinguish information with a high degree of disaster relevance from information with a low degree of relevance. By automatically harvesting and categorising social media conversation streams, the framework identifies highly disaster-relevant information that emergency services can use for intelligence gathering and decision-making.
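A toy sketch of the categorisation step, training a simple text classifier to separate high- from low-relevance posts; the example posts, labels and model choice are invented for illustration, and a deployed framework would train on far larger annotated streams:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labelled posts (1 = high disaster relevance, 0 = low); real training data
# would come from harvested, manually annotated social media streams.
posts = [
    "flood water rising fast near the bridge, people trapped on roofs",
    "road to the hospital blocked by fallen trees after the storm",
    "power out across the whole suburb, anyone know when it's back?",
    "great coffee this morning, loving the sunshine",
    "watching a movie about a hurricane tonight",
    "my cat hates the rain",
]
labels = [1, 1, 1, 0, 0, 0]

classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(posts, labels)

new_post = "family stuck on the highway, flood water over the road"
print("high relevance probability:", round(classifier.predict_proba([new_post])[0, 1], 2))
```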