106 results for 256
Abstract:
It is a major challenge to clearly identify the boundary between positive and negative streams in information filtering systems. Several attempts have used negative feedback to address this challenge; however, two issues arise when using negative relevance feedback to improve the effectiveness of information filtering. The first is how to select constructive negative samples in order to reduce the space of negative documents. The second is how to identify noisy extracted features that should be updated based on the selected negative samples. This paper proposes a pattern-mining-based approach that selects offenders from the negative documents, where an offender can be used to reduce the side effects of noisy features. It also classifies extracted features (i.e., terms) into three categories: positive specific terms, general terms, and negative specific terms. In this way, multiple revising strategies can be used to update the extracted features. An iterative learning algorithm is also proposed to implement this approach on the RCV1 data collection, and extensive experiments show that the proposed approach achieves encouraging performance that is also consistent for adaptive filtering.
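The three-way term categorization described above can be illustrated with a minimal sketch. This is an interpretation of the abstract's idea, not the authors' algorithm: a term occurring only in positive documents is treated as positive specific, one occurring only in the selected negative documents (offenders) as negative specific, and a term in both as general. The example documents are invented.

```python
# Hypothetical sketch of the three-way term categorization: positive specific,
# general, and negative specific terms, based on which side(s) a term occurs in.
def categorize_terms(positive_docs, negative_docs):
    # Each document is modelled as a set of terms.
    pos_terms = set().union(*positive_docs) if positive_docs else set()
    neg_terms = set().union(*negative_docs) if negative_docs else set()
    return {
        "positive_specific": pos_terms - neg_terms,  # only in positive docs
        "general": pos_terms & neg_terms,            # in both streams
        "negative_specific": neg_terms - pos_terms,  # only in offenders
    }

categories = categorize_terms(
    positive_docs=[{"mining", "pattern", "data"}, {"pattern", "filter"}],
    negative_docs=[{"noise", "data"}],
)
```

Each category could then be revised with its own strategy, as the abstract suggests.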
Abstract:
Features derived from the trispectra of DFT magnitude slices are used for multi-font digit recognition. These features are insensitive to translation, rotation, or scaling of the input, and they are also robust to noise. Classification accuracy tests were conducted on a common database of 256 × 256 pixel bilevel images of digits in 9 fonts. Randomly rotated and translated noisy versions were used for training and testing. The results indicate that the trispectral features are better than moment invariants and affine moment invariants. They achieve a classification accuracy of 95% compared to about 81% for Hu's (1962) moment invariants and 39% for the Flusser and Suk (1994) affine moment invariants on the same data in the presence of 1% impulse noise using a 1-NN classifier. For comparison, a multilayer perceptron with no normalization for rotations and translations yields 34% accuracy on 16 × 16 pixel low-pass filtered and decimated versions of the same data.
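The 1-NN classifier used in the comparison above is simple enough to sketch. The feature vectors below are invented placeholders, not actual trispectral features; only the nearest-neighbour decision rule is shown.

```python
import math

# Minimal 1-nearest-neighbour classifier: assign the query the label of the
# closest training vector under Euclidean distance.
def one_nn(train, query):
    # train: list of (feature_vector, label) pairs
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(train, key=lambda pair: dist(pair[0], query))[1]

# Toy 2-D "features" standing in for the real trispectral feature vectors.
label = one_nn([([0.1, 0.9], "3"), ([0.8, 0.2], "7")], [0.15, 0.85])
```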
Abstract:
Background: Bioimpedance techniques provide a reliable method of assessing unilateral lymphedema in a clinical setting. Bioimpedance devices are traditionally used to assess body composition at a current frequency of 50 kHz. However, these devices are not transferable to the assessment of lymphedema, as the sensitivity of measuring the impedance of extracellular fluid is frequency dependent. It has previously been shown that the best frequency to detect extracellular fluid is 0 kHz (or DC). However, measurement at this frequency is not possible in practice due to the high skin impedance at DC, and an estimate is usually determined from low frequency measurements. This study investigated the efficacy of various low frequency ranges for the detection of lymphedema. Methods and Results: Limb impedance was measured at 256 frequencies between 3 kHz and 1000 kHz for a sample control population, arm lymphedema population, and leg lymphedema population. Limb impedance was measured using the ImpediMed SFB7 and ImpediMed L-Dex® U400 with equipotential electrode placement on the wrists and ankles. The contralateral limb impedance ratio for arms and legs was used to calculate a lymphedema index (L-Dex) at each measurement frequency. The standard deviation of the limb impedance ratio in a healthy control population has been shown to increase with frequency for both the arm and leg. Box and whisker plots of the spread of the control and lymphedema populations show that there exists good differentiation between the arm and leg L-Dex measured for lymphedema subjects and the arm and leg L-Dex measured for control subjects up to a frequency of about 30 kHz. Conclusions: It can be concluded that impedance measurements above a frequency of 30 kHz decrease sensitivity to extracellular fluid and are not reliable for early detection of lymphedema.
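The core quantity in the study above is the contralateral limb impedance ratio from which the L-Dex is derived. The sketch below shows only the ratio idea (unaffected limb impedance over affected limb impedance at a given frequency); the actual ImpediMed L-Dex scaling constants are not reproduced, and the impedance values are invented.

```python
# Simplified contralateral limb impedance ratio, the basis of an L-Dex-style
# index: impedance of the unaffected limb divided by that of the affected limb.
# Extracellular fluid accumulation lowers the affected limb's impedance,
# pushing the ratio above 1.
def impedance_ratio(z_unaffected_ohm, z_affected_ohm):
    return z_unaffected_ohm / z_affected_ohm

ratio = impedance_ratio(z_unaffected_ohm=330.0, z_affected_ohm=300.0)
```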
Abstract:
Background: Patient privacy and confidentiality (PPaC) is an important consideration for nurses and other members of the health care team. Can a patient expect confidentiality, and in particular privacy, in the current climate of emergency health care? Do staff who work in the Emergency Department (ED) see confidentiality as an important factor when providing emergency care? These questions are important to consider. Methods: This is a two-phase quality improvement project, developed and implemented over a six-month period in a busy regional, tertiary referral ED. Results: Issues identified for this department included department design and layout, overcrowding due to patient flow and access block, staff practices, and department policies, which were also influenced by team culture and use of space. Conclusions: Changes successful in addressing these issues included increased staff awareness of PPaC, intercom paging prior to nursing handover to remove visitors during handover, a one-visitor-per-patient policy, designated places for handover, allocated bed space for patient reviews and assessment, and a strategy to temporarily move the patient when procedures were to be undertaken in a shared bed space. These are important issues when considering policy, practice and department design in the ED.
Abstract:
Background: The FANCA gene is one of the genes in which mutations lead to Fanconi anaemia, a rare autosomal recessive disorder characterised by congenital abnormalities, bone marrow failure, and predisposition to malignancy. FANCA is also a potential breast and ovarian cancer susceptibility gene. A novel allele was identified that carries a tandem duplication of a 13 base pair sequence in the promoter region. Methods: We screened germline DNA from 352 breast cancer patients, 390 ovarian cancer patients, and 256 normal controls to determine whether the presence of either allele (with or without the duplication) was associated with an increased risk of breast or ovarian cancer. Results: The duplication allele had a frequency of 0.34 in the normal controls. There was a nonsignificant decrease in the frequency of the duplication allele in breast cancer patients. The frequency of the duplication allele was significantly decreased in ovarian cancer patients. However, when malignant and benign tumours were considered separately, the decrease was only significant in benign tumours. Conclusion: The allele with the tandem duplication does not appear to modify breast cancer risk but may act as a low penetrance protective allele for ovarian cancer.
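The allele frequency reported above follows from standard gene counting: each subject carries two alleles, so the frequency is (2 × homozygous carriers + heterozygotes) / (2 × subjects). The genotype counts below are invented for illustration (chosen so 256 subjects give a frequency near 0.34); they are not the study's data.

```python
# Allele frequency by gene counting: each person contributes two alleles.
def allele_frequency(hom_dup, het, hom_wt):
    n = hom_dup + het + hom_wt  # number of subjects
    return (2 * hom_dup + het) / (2 * n)

# Hypothetical genotype counts for 256 controls (30 + 114 + 112 = 256).
freq = allele_frequency(hom_dup=30, het=114, hom_wt=112)
```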
Abstract:
At cryogenic temperatures, a fiber Bragg grating (FBG) temperature sensor with controllable sensitivity and variable measurement range is demonstrated using a bimetallic configuration. In experiments, sensitivities of -51.2, -86.4, and -520 pm/K are achieved by varying the lengths of the metals. Measurement ranges of 293-290.5, 283-280.5, and 259-256.5 K are achieved by shortening the gap between the metals.
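Given one of the reported sensitivities, converting a measured Bragg wavelength shift to a temperature change is a direct division. The wavelength shift in the example is invented; only the reported -520 pm/K sensitivity comes from the abstract.

```python
# Temperature change from a Bragg wavelength shift: dT = d(lambda) / sensitivity.
def delta_temperature_k(delta_lambda_pm, sensitivity_pm_per_k):
    return delta_lambda_pm / sensitivity_pm_per_k

# With the -520 pm/K configuration, a +1300 pm shift corresponds to -2.5 K.
dT = delta_temperature_k(delta_lambda_pm=1300.0, sensitivity_pm_per_k=-520.0)
```

Note the negative sensitivity: the Bragg wavelength increases as the temperature falls in this configuration.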
Abstract:
Although mobile phones are often used in public urban places to interact with one's geographically dispersed social circle, they can also facilitate interactions with people in the same public urban space. The PlaceTagz study investigates how physical artefacts in public urban places can be utilised and combined with mobile phone technologies to facilitate interactions. Printed on stickers, PlaceTagz are QR codes linking to a digital message board that enables collocated users to interact with each other over time, resulting in a place-based digital memory. This exploratory project set out to investigate whether and how PlaceTagz are used by urban dwellers in a real-world deployment. We present findings from analysing content received through PlaceTagz and from interviews with application users. The QR codes, which contain no contextual information, piqued users' curiosity about the embedded link's destination and provoked comments regarding people, place and technology.
Abstract:
Social media and web 2.0 tools offer opportunities to devise novel participation strategies that can engage previously difficult-to-reach segments of society, as well as new ones, in urban planning. This paper examines participatory planning in the four local government areas of Brisbane City Council, Gold Coast City Council, Redland City Council, and Toowoomba Regional Council, all situated in South East Queensland, Australia. The paper discusses how social media and web 2.0 tools can deliver a more engaging planning experience to citizens, and investigates local governments' current use of, and receptiveness to, social media tools for plan making and community engagement. The study's research informed the development of criteria to assess the level of participation reached through the current use of social media and web 2.0 in the four local government areas. This resulted in an adaptation of the International Association for Public Participation (IAP2) Toolbox to integrate these new tools, which is presented to encourage further discussion and evaluation by planning professionals.
Abstract:
A simple and effective down-sampling algorithm, the Peak-Hold-Down-Sample (PHDS) algorithm, is developed in this paper to enable rapid and efficient data transfer in remote condition monitoring applications. The algorithm is particularly useful for high frequency Condition Monitoring (CM) techniques and for low speed machine applications, since the combination of high sampling frequency and low rotating speed generally leads to unwieldy data sizes. The effectiveness of the algorithm was evaluated and tested on four sets of data in the study. One set was extracted from the condition monitoring signal of a practical industrial application. Another was acquired from a low speed machine test rig in the laboratory. The other two sets were computer-simulated bearing defect signals containing either a single defect or multiple bearing defects. The results show that the PHDS algorithm can substantially reduce the size of the data while preserving the critical bearing defect information for all the data sets used in this work, even at a large down-sample ratio (i.e., 500 times down-sampled). In contrast, the conventional down-sampling technique from signal processing eliminates useful and critical information, such as bearing defect frequencies, when the same down-sample ratio is employed. The conventional technique also introduces noise and artificial frequency components, thus limiting its usefulness for machine condition monitoring applications.
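The peak-hold idea can be sketched in a few lines. This is an interpretation of the PHDS concept from the abstract, not the authors' implementation: each block of `ratio` consecutive samples is replaced by its peak (largest absolute value) sample, so short impulsive defect events survive down-sampling, whereas plain decimation (keeping every `ratio`-th sample) would usually miss them.

```python
# Peak-hold down-sampling sketch: keep the peak-magnitude sample of each
# block of `ratio` samples, preserving impulsive bearing-defect content.
def peak_hold_downsample(signal, ratio):
    return [
        max(signal[i:i + ratio], key=abs)  # sample with largest |value|
        for i in range(0, len(signal), ratio)
    ]

# The -0.9 impulse survives, though plain decimation at ratio 3 would drop it.
out = peak_hold_downsample([0.1, -0.9, 0.2, 0.05, 0.0, 0.3], ratio=3)
```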
Abstract:
There is a growing interest in the use of megavoltage cone-beam computed tomography (MV CBCT) data for radiotherapy treatment planning. To calculate accurate dose distributions, knowledge of the electron density (ED) of the tissues being irradiated is required. In the case of MV CBCT, it is necessary to determine a calibration relating CT number to ED, utilizing the photon beam produced for MV CBCT. A number of different parameters can affect this calibration. This study was undertaken on the Siemens MV CBCT system, MVision, to evaluate the effect of the following parameters on the reconstructed CT pixel value to ED calibration: the number of monitor units (MUs) used (5, 8, 15 and 60 MUs), the image reconstruction filter (head and neck, and pelvis), reconstruction matrix size (256 by 256 and 512 by 512), and the addition of extra solid water surrounding the ED phantom. A Gammex electron density CT phantom containing EDs from 0.292 to 1.707 was imaged under each of these conditions. A linear relationship between MV CBCT pixel value and ED was demonstrated for all MU settings and over the range of EDs. Changes in MU number did not dramatically alter the MV CBCT ED calibration. The use of different reconstruction filters was found to affect the MV CBCT ED calibration, as was the addition of solid water surrounding the phantom. Dose distributions from treatment plans calculated on simulated image data from a 15 MU head-and-neck reconstruction filter MV CBCT image, using either the MV CBCT ED calibration curve matching the image data parameters or one from a 15 MU pelvis reconstruction filter, showed small and clinically insignificant differences. Thus, the use of a single MV CBCT ED calibration curve is unlikely to result in any clinical differences. However, to ensure minimal uncertainties in dose reporting, MV CBCT ED calibration could be carried out using parameter-specific calibration measurements.
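The linear pixel-value-to-ED calibration described above amounts to a least-squares straight-line fit. The (pixel value, ED) pairs below are fabricated for illustration; only the linearity of the relationship comes from the abstract.

```python
# Ordinary least-squares fit of ED = slope * pixel_value + intercept.
def linear_fit(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum(
        (x - mx) ** 2 for x in xs
    )
    return slope, my - slope * mx

# Hypothetical calibration points (pixel value -> relative electron density).
slope, intercept = linear_fit([100.0, 200.0, 300.0], [0.5, 1.0, 1.5])
```

A separate fit would be made per parameter set (reconstruction filter, matrix size, etc.) for parameter-specific calibration.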
Abstract:
In this paper, we present WebPut, a prototype system that adopts a novel web-based approach to the data imputation problem. To this end, WebPut utilizes the available information in an incomplete database in conjunction with the data consistency principle. Moreover, WebPut extends effective Information Extraction (IE) methods to formulate web search queries that are capable of retrieving missing values with high accuracy. WebPut employs a confidence-based scheme that efficiently leverages our suite of data imputation queries to automatically select the most effective imputation query for each missing value. A greedy iterative algorithm is also proposed to schedule the imputation order of the different missing values in a database, and in turn the issuing of their corresponding imputation queries, improving the accuracy and efficiency of WebPut. Experiments based on several real-world data collections demonstrate that WebPut outperforms existing approaches.
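The confidence-based selection and greedy scheduling described above can be sketched abstractly. This is a hypothetical reading of the abstract, not WebPut's actual code: for each missing value, keep the candidate query with the highest confidence, then process missing values in descending confidence order. The field names, query labels and scores are invented.

```python
# Hypothetical confidence-based imputation scheduling: pick the best query per
# missing value, then order the imputations by descending confidence.
def schedule_imputations(candidates):
    # candidates: {missing_value_id: [(query, confidence), ...]}
    best = {
        mv: max(queries, key=lambda qc: qc[1])  # highest-confidence query
        for mv, queries in candidates.items()
    }
    # Greedy order: most confident imputations are issued first.
    return sorted(best.items(), key=lambda item: -item[1][1])

plan = schedule_imputations({
    "city": [("q1", 0.6), ("q2", 0.9)],
    "phone": [("q3", 0.7)],
})
```

In the full system the order matters because an imputed value can feed into later queries; this sketch only shows the selection and ordering step.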
Abstract:
Since 2004, China has set up more than 400 Confucius Institutes and 500 Confucius Classrooms in 108 countries to promote Chinese language and culture. Despite these impressive numbers, these institutions are still surprisingly under-studied. This article uses Confucius Institutes in Australia as a case study to deepen the understanding of China's new cultural diplomacy tool. The article describes Confucius Institutes as a form of strategic stakeholder engagement and argues that this collaborative tool of cultural diplomacy depends heavily on the commitment of its local stakeholders.