Abstract:
A Monte Carlo model of an Elekta iViewGT amorphous silicon electronic portal imaging device (a-Si EPID) has been validated for pre-treatment verification of clinical IMRT treatment plans. The simulations used the BEAMnrc and DOSXYZnrc Monte Carlo codes to predict the response of the iViewGT a-Si EPID model. The predicted EPID images were compared with measured images obtained by delivering a photon beam from an Elekta Synergy linac to the Elekta iViewGT a-Si EPID, which was used with no additional build-up material. Frame-averaged EPID images were acquired and processed using in-house software. The agreement between the predicted and measured images was analyzed using the gamma analysis technique with acceptance criteria of 3%/3 mm. The results show that the predicted EPID images for four clinical IMRT treatment plans agree well with the measured EPID signal: three prostate IMRT plans had average gamma pass rates of more than 95.0%, and a spinal IMRT plan had an average gamma pass rate of 94.3%. During this work a routine MLC calibration was performed and one of the IMRT treatments was re-measured with the EPID; a change in the gamma pass rate for one field was observed. This motivated a series of experiments to investigate the sensitivity of the method by introducing delivery errors (MLC position errors and dosimetric overshoot) into the simulated EPID images. The method was found to be sensitive to 1 mm leaf position errors and 10% overshoot errors.
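For illustration, a minimal brute-force sketch of a global gamma analysis of the kind used for such comparisons is given below, assuming both images share the same regular pixel grid; the function name, pixel pitch and search strategy are illustrative and this is not the in-house software used in the study.

```python
# Minimal 2D global gamma analysis sketch (illustrative only).
import numpy as np

def gamma_pass_rate(reference, evaluated, pixel_mm=1.0, dose_frac=0.03, dta_mm=3.0):
    """Brute-force global gamma analysis; returns the pass rate in percent."""
    ref = np.asarray(reference, dtype=float)
    ev = np.asarray(evaluated, dtype=float)
    dose_tol = dose_frac * ref.max()           # global 3% dose criterion
    search = int(np.ceil(dta_mm / pixel_mm))   # search radius in pixels

    gamma = np.empty(ref.shape)
    for i, j in np.ndindex(ref.shape):         # every evaluated pixel
        best = np.inf
        for oy in range(-search, search + 1):
            for ox in range(-search, search + 1):
                y, x = i + oy, j + ox
                if 0 <= y < ref.shape[0] and 0 <= x < ref.shape[1]:
                    dose_term = (ev[i, j] - ref[y, x]) ** 2 / dose_tol ** 2
                    dist_term = ((oy ** 2 + ox ** 2) * pixel_mm ** 2) / dta_mm ** 2
                    best = min(best, dose_term + dist_term)
        gamma[i, j] = np.sqrt(best)

    return 100.0 * np.mean(gamma <= 1.0)       # fraction of pixels with gamma <= 1
```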
Abstract:
Dirt collected with sugarcane is processed and separated from the juice in the sugar factory by filtration equipment and returned to the cane fields. New technologies over the past decade have enabled performance improvements for this key unit operation. Filter mud product still contains a reasonable amount of sugar, and the transportation of high-moisture mud product has considerable cost. Australia's traditional approach has been to use Rotary Vacuum Filters for processing and separating mud and other impurities from juice, but in recent years there has been interest in reducing sugar losses and transportation costs through the use of new technologies such as Horizontal Bed Filters, Vacuum Belt Press Filters, Membrane Press Filters and Centrifuges. Increasingly, this alternative equipment is being installed in new factories. This chapter describes the general principles of mud filtration theory and mud conditioning, followed by a detailed description and review of the various filtration technologies and an analysis of the relative merits of the equipment.
Abstract:
STAC is a mobile application (app) designed to promote the benefits of climate-aware urban development in subtropical environments. Although STAC is primarily a tool for understanding climate-efficient buildings in Brisbane, Australia, it also demonstrates how exemplary buildings operate in other subtropical cities of the world. The STAC research and development team applied research undertaken by the Centre for Subtropical Design (Brisbane) to profile buildings, past and present, that have contributed to the creation of a vibrant society, a viable economy, a healthy environment, and an authentic sense of place. In collaboration with researchers from the field of Interaction Design, this knowledge and data was collated, processed and curated for presentation via a custom mobile application designed to distribute this research for review and consideration on location in local settings, and for comparison across the other global subtropical regions and projects identified by this research. The collaboration adopted a Design-Based Research (DBR) methodology guided by the main tenets of research and design iteration and cross-discipline collaboration in real-world settings, resulting in the formulation of contextually sensitive design principles, theories, and tools for design intervention. This was combined with a substantial contextual review of available technology and data and a subsequent case study analysis of exemplar design applications.
Abstract:
Aim The assessment of treatment plans is an important component in the education of radiation therapists. The grade for a plan is currently based on subjective assessment of a range of criteria. Automating the assessment could provide a number of advantages, including faster feedback, reduced chance of human error, and simpler aggregation of past results. Method A collection of treatments planned by a cohort of 27 second-year radiation therapy students was selected for quantitative evaluation. Treatment sites included the bladder, cervix, larynx, parotid and prostate, although only the larynx plans had been assessed in detail. The plans were designed with the Pinnacle system and exported using the DICOM framework. Assessment criteria included beam arrangement optimisation, volume contouring, target dose coverage and homogeneity, and organ-at-risk sparing. The in-house Treatment and Dose Assessor (TADA) software was evaluated for suitability in assisting with the quantitative assessment of these plans. Dose volume data were exported in per-student and per-structure data tables, along with beam complexity metrics, dose volume histograms, and reports on naming conventions. Results The treatment plans were exported and processed using TADA, with the processing of all 27 plans for each treatment site taking less than two minutes. Naming conventions were successfully checked against a reference protocol. Significant variations between student plans were found. Correlation with assessment feedback was established for the larynx plans. Conclusion The data generated could be used to inform the selection of future assessment criteria, monitor student development, and provide useful feedback to the students. The provision of objective, quantitative evaluations of plan quality would be a valuable addition not only to radiotherapy education programmes but also to staff development and, potentially, credentialing methods. New functionality within TADA developed for this work could be applied clinically, for example, to evaluate protocol compliance.
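As an illustration of the kind of dose-volume processing involved (not the TADA implementation itself), the sketch below reads a DICOM RT Dose grid with pydicom and computes a whole-grid cumulative dose-volume histogram; a real assessment tool would first restrict the dose grid to each contoured structure from the RT Structure Set, and the file name shown is hypothetical.

```python
# Illustrative DVH sketch using pydicom and NumPy (not the TADA code).
import numpy as np
import pydicom

def cumulative_dvh(rtdose_path, n_bins=200):
    """Cumulative DVH over the whole dose grid of a DICOM RT Dose file."""
    ds = pydicom.dcmread(rtdose_path)
    dose = ds.pixel_array * float(ds.DoseGridScaling)   # dose values in Gy
    edges = np.linspace(0.0, dose.max(), n_bins + 1)
    counts, _ = np.histogram(dose, bins=edges)
    # Fraction of voxels receiving at least each dose level.
    volume_fraction = 1.0 - np.cumsum(counts) / dose.size
    return edges[1:], volume_fraction

# Example use (hypothetical file name):
# dose_levels, vol_frac = cumulative_dvh("student_plan_rtdose.dcm")
```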
Abstract:
Use of the hand is vital in working life due to the grabbing and pinching it performs. Spherical grip is the most commonly used, due to its similarity to gripping a computer mouse, so knowledge of its execution and the elements involved is essential. Analysis of this exertion with surface electromyography devices (to register muscular activity) and accelerometer devices (to register movement values) can provide multiple variables. Six subjects performed ball gripping while electromyography values (thenar region, hypothenar region, first dorsal interosseous, wrist flexors, flexor carpi ulnaris and wrist extensor muscles) and accelerometer values (thumb, index, middle, ring and little fingers, and palm) were registered in real time. The obtained data were resampled in R and processed with a MATLAB script based on an automatic numerical sequence recognition program. Electromyography values were normalized to maximum voluntary contraction, whilst modulus values were calculated for the acceleration vector. After processing and analysing the obtained data and signals, it was possible to identify five stages of movement according to the modulus of the palm acceleration vector. The statistical analysis of the variables was descriptive: averages and standard deviations. The outcome variables focus on the variations of the vector moduli (between the maximum and minimum values of each modulus in each phase) and the maximum values of the normalized electromyography of each muscle. Analysis of movement through accelerometer and electromyography variables can give us an insight into the operation of spherical grip, and the protocol and data processing can be used as a system to complement existing assessments of the hand.
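A minimal sketch of the two derived measures described above (acceleration vector modulus and EMG normalized to maximum voluntary contraction) is shown below, assuming the resampled recordings are available as NumPy arrays; all variable names are illustrative, and this is not the authors' R/MATLAB processing code.

```python
# Illustrative computation of the acceleration modulus and MVC-normalized EMG.
import numpy as np

def acceleration_modulus(acc_xyz):
    """Modulus (magnitude) of the 3-axis acceleration vector per sample."""
    acc = np.asarray(acc_xyz, dtype=float)          # shape (n_samples, 3)
    return np.sqrt(np.sum(acc ** 2, axis=1))

def normalize_emg(emg, mvc_value):
    """Express a rectified EMG signal as % of maximum voluntary contraction."""
    return 100.0 * np.abs(np.asarray(emg, dtype=float)) / mvc_value

# Example (hypothetical arrays): range of the palm modulus within one movement
# phase, and the peak normalized activity of one muscle in the same phase.
# palm_mod = acceleration_modulus(palm_acc[phase_start:phase_end])
# phase_range = palm_mod.max() - palm_mod.min()
# peak_fdi = normalize_emg(fdi_emg[phase_start:phase_end], fdi_mvc).max()
```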
Abstract:
A series of star-shaped organic semiconductors has been synthesized from 1,3,6,8-tetrabromopyrene. The materials are soluble in common organic solvents, allowing for solution processing of devices such as organic light-emitting diodes (OLEDs). One of the materials, 1,3,6,8-tetrakis(4-butoxyphenyl)pyrene, has been used as the active emitting layer in simple solution-processed OLEDs with deep blue emission (CIE = 0.15, 0.18) and maximum efficiency and brightness of 2.56 cd/A and >5000 cd/m2, respectively.
Abstract:
BACKGROUND: The use of nonstandardized N-terminal pro-B-type natriuretic peptide (NT-proBNP) assays can contribute to the misdiagnosis of heart failure (HF). Moreover, a common consensus has yet to be established regarding the circulating forms of NT-proBNP being measured by current assays. We aimed to characterize and quantify the various forms of NT-proBNP in the circulation of HF patients. METHODS: Plasma samples were collected from HF patients (n = 20) at rest and stored at -80 °C. NT-proBNP was enriched from HF patient plasma by immunoprecipitation followed by mass spectrometric analysis. Customized homogeneous sandwich AlphaLISA® immunoassays were developed and validated to quantify 6 fragments of NT-proBNP. RESULTS: Mass spectrometry identified the presence of several N- and C-terminally processed forms of circulating NT-proBNP, with physiological proteolysis between Pro2-Leu3, Leu3-Gly4, Pro6-Gly7, and Pro75-Arg76. Consistent with this result, AlphaLISA immunoassays demonstrated that antibodies targeting the extreme N or C termini measured a low apparent concentration of circulating NT-proBNP. The apparent circulating NT-proBNP concentration was increased with antibodies targeting nonglycosylated and nonterminal epitopes (P < 0.05). CONCLUSIONS: In plasma collected from HF patients, immunoreactive NT-proBNP was present as multiple N- and C-terminally truncated fragments of the full-length NT-proBNP molecule. Immunodetection of NT-proBNP was significantly improved with the use of antibodies that did not target these terminal regions. These findings support the development of a next-generation NT-proBNP assay targeting nonterminal epitopes and avoiding the central glycosylated region of this molecule. © 2013 American Association for Clinical Chemistry
Abstract:
• Premise of the study: Here we propose a staining protocol using TBO and Ruthenium red to reliably identify secondary compounds in the leaves of some species of Myrtaceae. • Methods and results: Leaves of 10 species representing 10 different genera of Myrtaceae were processed and stained using five different combinations of Ruthenium red and TBO. Optimal staining conditions were determined to be 1 min of Ruthenium red (0.05% aqueous) and 45 s of TBO (0.1% aqueous). Secondary compounds clearly identified under this treatment include mucilage in the mesophyll, polyphenols in the cuticle, lignin in fibers and xylem, tannins and carboxylated polysaccharides in the epidermis, and pectic substances in primary cell walls. • Conclusions: Potential applications of this protocol include systematic, phytochemical and ecological investigations in Myrtaceae. It might also be applicable to other plant families rich in secondary compounds and could be used as a preliminary screening method for the extraction of these compounds.
Abstract:
Description of a patient's injuries is recorded in narrative text form by hospital emergency departments. For statistical reporting, this text data needs to be mapped to pre-defined codes. Existing research in this field uses the naïve Bayes probabilistic method to build classifiers for this mapping. In this paper, we focus on providing guidance on the selection of a classification method. We build a number of classifiers belonging to different classification families: decision tree, probabilistic, neural network, instance-based, ensemble-based and kernel-based linear classifiers. Extensive pre-processing was carried out to ensure the quality of the data and, hence, the quality of the classification outcome. Records with a null entry in the injury description were removed. Misspellings were corrected by finding and replacing each misspelt word with a sound-alike word. Meaningful phrases were identified and kept intact, rather than having parts of a phrase removed as stop words. Abbreviations appearing in many variant forms of entry were manually identified and reduced to a single form. Clustering was used to discriminate between non-frequent and frequent terms. This process reduced the number of text features dramatically, from about 28,000 to 5,000. The medical narrative text injury dataset under consideration is composed of many short documents. The data can be characterized as high-dimensional and sparse: few features are irrelevant, but features are correlated with one another. Therefore, matrix factorization techniques such as Singular Value Decomposition (SVD) and Non-negative Matrix Factorization (NNMF) were used to map the processed feature space to a lower-dimensional feature space, and classifiers were built on the reduced feature space. A set of experiments was conducted to determine which classification method is best for this medical text classification task. Non-negative Matrix Factorization combined with a Support Vector Machine achieved 93% precision, higher than all of the traditional classifiers tested. We also found that TF/IDF weighting, which works well for long-text classification, is inferior to binary weighting for short-document classification. Another finding is that the top-n terms should be removed in consultation with medical experts, as this affects the classification performance.
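A hedged sketch of the best-performing configuration described above (binary term weighting, NNMF dimensionality reduction, and an SVM classifier), expressed with scikit-learn, is shown below; the corpus variables, the component count and other parameters are placeholders rather than the authors' exact settings.

```python
# Illustrative binary-weighting -> NNMF -> SVM text classification pipeline.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import NMF
from sklearn.svm import LinearSVC
from sklearn.pipeline import Pipeline

pipeline = Pipeline([
    ("binary_terms", CountVectorizer(binary=True)),        # binary weighting, not TF/IDF
    ("nnmf", NMF(n_components=200, init="nndsvd", max_iter=400)),  # reduced feature space
    ("svm", LinearSVC()),                                   # linear SVM classifier
])

# Hypothetical usage with short narrative injury texts and their codes:
# pipeline.fit(train_texts, train_codes)
# predicted_codes = pipeline.predict(test_texts)
```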
Abstract:
This thesis studied cadmium sulfide and cadmium selenide quantum dots and their performance as light absorbers in quantum dot-sensitised solar cells. The research has contributed to the understanding of size-dependent photodegradation, passivation and the particle growth mechanism of cadmium sulfide quantum dots grown using the SILAR method, and of the role of ZnSe shell coatings in improving solar cell performance.
Abstract:
While the neural regions associated with facial identity recognition are considered to be well defined, the neural correlates of processing non-moving and moving images of facial emotion are less clear. This study examined brain electrical activity changes in 26 participants (14 males, M = 21.64, SD = 3.99; 12 females, M = 24.42, SD = 4.36) during a passive face viewing task, a scrambled face task, and separate emotion and gender face discrimination tasks. The steady-state visual evoked potential (SSVEP) was recorded from 64 electrode sites. Consistent with previous research, face-related activity was evident at scalp regions over the parieto-temporal region approximately 170 ms after stimulus presentation. Results also identified different SSVEP spatio-temporal changes associated with the processing of static and dynamic facial emotions with respect to gender, with static stimuli predominantly associated with an increase in inhibitory processing within the frontal region. Dynamic facial emotions were associated with changes in SSVEP response within the temporal region, which are proposed to index inhibitory processing. It is suggested that static images represent non-canonical stimuli which are processed via different mechanisms to their more ecologically valid dynamic counterparts.
Abstract:
As a key element in their response to new media forcing transformations in mass media and media use, newspapers have deployed various strategies not only to establish online and mobile products and develop healthy business plans, but also to set out to become dominant portals. Their response to change was the subject of an early investigation by one of the present authors (Keshvani 2000). That was part of a set of short studies inquiring into what impact new software applications and digital convergence might have on journalism practice (Tickle and Keshvani 2000), and also looking for demonstrations of the way that innovations, technologies and protocols then under development might produce a “wireless, streamlined electronic news production process” (Tickle and Keshvani 2001). The newspaper study compared the online products of The Age in Melbourne and the Straits Times in Singapore. It provided an audit of the Singapore and Australia Information and Communications Technology (ICT) climate, concentrating on the state of development of carrier networks as a determining factor in the potential strength of the two services in their respective markets. In the outcome, contrary to initial expectations, the early cable roll-out and extensive ‘wiring’ of the city in Singapore had not produced a level of uptake of Internet services as strong as that achieved in Melbourne by more ad hoc and varied strategies. By interpretation, while news websites and online content were at an early stage of development everywhere, and much the same as one another, no determining structural imbalance existed to separate these leading media participants in Australia and South-east Asia. The present research revisits that situation by again studying the online editions of the two large newspapers in the original study, plus one other, The Courier Mail (included, in recognition of the diversification of product types in this field, as a representative of Newscorp, now a major participant). The inquiry works through the principle of comparison. It is an exercise in qualitative, empirical research that establishes a comparison between the situation in 2000, as described in the earlier work, and the situation in 2014, after a decade of intense development in digital technology affecting the media industries. It is in that sense a follow-up study on the earlier work, although this time giving emphasis to the content and style of the actual products as experienced by their users. It compares the online and print editions of each of these three newspapers; then the three mastheads as print and online entities, among themselves; and finally it compares one against the other two, as representing a South-east Asian model and Australian models. This exercise is accompanied by a review of literature on the developments in ICT affecting media production and media organisations, to establish the changed context. The new study of the online editions is conducted as a systematic appraisal of the first level, or principal screens, of the three publications, over the course of six days (10-15.2.14 inclusive). For this, categories for analysis were developed through a preliminary examination of the products over three days in the preceding week.
That process identified significant elements of media production, such as: variegated sourcing of materials; randomness in the presentation of items; differential production values among the media platforms considered, whether text, video or still images; the occasional repurposing and repackaging of top news stories of the day; and the presence of standard news values, once again drawn out of the trial ‘bundle’ of journalistic items. Reduced in this way, the online artefacts become comparable with the companion print editions from the same days. The categories devised and then used in the appraisal of the online products have been adapted to print, to give the closest match between the sets of variables. This device, studying the two sets of publications by like standards (essentially production values and news values), has enabled the comparisons to be made. This comparison of the online and print editions of each of the three publications was set up as the first step in the investigation. In recognition of the nature of the artefacts, as ones that carry very diverse information by subject and level of depth and involve heavy creative investment in the formulation and presentation of the information, the assessment also includes an open section for interpreting and commenting on main points of comparison. This takes the form of a field for text, for the insertion of notes, in the table employed for summarising the features of each product for each day. Once the sets of comparisons outlined above are noted, the process becomes interpretative, guided by the notion of change. In the context of changing media technology and publication processes, what substantive alterations have taken place in the overall effort of news organisations in the print and online fields since 2001, and in their print and online products separately? Have they diverged or continued along similar lines? The remaining task is to begin to make inferences from that. Will the examination of findings support the proposition that a review of the earlier study, and a forensic review of new models, provides evidence of the character and content of change, especially change in journalistic products and practice? Will it permit an authoritative description of the essentials of such change in products and practice? Will it permit generalisation, and provide a reliable base for discussion of the implications of change and future prospects? Preliminary observations suggest a more dynamic and diversified product has been developed in Singapore, well themed and evidently sustained by public commitment and habituation to diversified online and mobile media services. The Australian products suggest a concentrated corporate and journalistic effort and deployment of resources, with a strong market focus, but less settled and ordered, and showing signs of limitations imposed by the delay in establishing a uniform, large broadband network. The scope of the study is limited. It is intended to test, and take advantage of, the original study as evidentiary material from the early days of newspaper companies’ experimentation with online formats. Both are small studies. The key opportunity for discovery lies in the ‘time capsule’ factor: the availability of well-gathered and processed information on major newspaper company production at the threshold of a transformational decade of change in their industry. The comparison stands to identify key changes.
It should also be useful as a reference for further inquiries of the same kind, and for monitoring the situation of newspaper portals online into the future.
Abstract:
π-Conjugated polymers are the most promising semiconductor materials for printed organic thin-film transistors (OTFTs) due to their excellent solution processability and mechanical robustness. However, solution-processed polymer semiconductors have shown poor charge transport properties, originating mainly from disordered polymer chain packing in the solid state compared to thermally evaporated small-molecule organic semiconductors. The low charge carrier mobility of polymer semiconductors, typically < 0.1 cm2/V·s, poses a challenge for most intended applications such as displays and radio-frequency identification (RFID) tags. Here we present our recent results on diketopyrrolopyrrole (DPP)-based polymers and demonstrate that when DPP is combined with appropriate electron-donating moieties such as thiophene and thienothiophene, very high charge carrier mobility values of ~1 cm2/V·s can be achieved.
Abstract:
Collections of biological specimens are fundamental to scientific understanding and characterization of natural diversity - past, present and future. This paper presents a system for liberating useful information from physical collections by bringing specimens into the digital domain so they can be more readily shared, analyzed, annotated and compared. It focuses on insects and is strongly motivated by the desire to accelerate and augment current practices in insect taxonomy which predominantly use text, 2D diagrams and images to describe and characterize species. While these traditional kinds of descriptions are informative and useful, they cannot cover insect specimens "from all angles" and precious specimens are still exchanged between researchers and collections for this reason. Furthermore, insects can be complex in structure and pose many challenges to computer vision systems. We present a new prototype for a practical, cost-effective system of off-the-shelf components to acquire natural-colour 3D models of insects from around 3 mm to 30 mm in length. ("Natural-colour" is used to contrast with "false-colour", i.e., colour generated from, or applied to, gray-scale data post-acquisition.) Colour images are captured from different angles and focal depths using a digital single lens reflex (DSLR) camera rig and two-axis turntable. These 2D images are processed into 3D reconstructions using software based on a visual hull algorithm. The resulting models are compact (around 10 megabytes), afford excellent optical resolution, and can be readily embedded into documents and web pages, as well as viewed on mobile devices. The system is portable, safe, relatively affordable, and complements the sort of volumetric data that can be acquired by computed tomography. This system provides a new way to augment the description and documentation of insect species holotypes, reducing the need to handle or ship specimens. It opens up new opportunities to collect data for research, education, art, entertainment, biodiversity assessment and biosecurity control. © 2014 Nguyen et al.
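As an illustration of the visual hull idea behind the reconstruction step (a much-simplified sketch, not the actual reconstruction software, and without the colour recovery it provides), the code below carves a voxel grid using binary silhouette masks and known 3x4 camera projection matrices; all names and parameters are assumptions.

```python
# Simplified voxel-carving sketch of a visual hull reconstruction.
import numpy as np

def visual_hull(silhouettes, projections, grid_min, grid_max, resolution=128):
    """Return a boolean voxel grid of points that project inside every silhouette."""
    axes = [np.linspace(lo, hi, resolution) for lo, hi in zip(grid_min, grid_max)]
    xs, ys, zs = np.meshgrid(*axes, indexing="ij")
    points = np.stack([xs.ravel(), ys.ravel(), zs.ravel(),
                       np.ones(xs.size)])                   # homogeneous, shape 4 x N
    inside = np.ones(xs.size, dtype=bool)
    for mask, P in zip(silhouettes, projections):
        mask = np.asarray(mask, dtype=bool)
        uvw = P @ points                                    # project voxels to the image plane
        u = np.round(uvw[0] / uvw[2]).astype(int)           # column index
        v = np.round(uvw[1] / uvw[2]).astype(int)           # row index
        valid = (u >= 0) & (u < mask.shape[1]) & (v >= 0) & (v < mask.shape[0])
        hit = np.zeros(xs.size, dtype=bool)
        hit[valid] = mask[v[valid], u[valid]]
        inside &= hit                                       # carve away voxels outside any silhouette
    return inside.reshape(xs.shape)
```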