Abstract:
Aim The assessment of treatment plans is an important component in the education of radiation therapists. The establishment of a grade for a plan is currently based on subjective assessment of a range of criteria. The automation of assessment could provide several advantages, including faster feedback, a reduced chance of human error, and simpler aggregation of past results. Method A collection of treatment plans produced by a cohort of 27 second-year radiation therapy students was selected for quantitative evaluation. Treatment sites included the bladder, cervix, larynx, parotid and prostate, although only the larynx plans had been assessed in detail. The plans were designed with the Pinnacle system and exported using the DICOM framework. Assessment criteria included beam arrangement optimisation, volume contouring, target dose coverage and homogeneity, and organ-at-risk sparing. The in-house Treatment and Dose Assessor (TADA) software was evaluated for suitability in assisting with the quantitative assessment of these plans. Dose volume data were exported in per-student and per-structure data tables, along with beam complexity metrics, dose volume histograms, and reports on naming conventions. Results The treatment plans were exported and processed using TADA, with the processing of all 27 plans for each treatment site taking less than two minutes. Naming conventions were successfully checked against a reference protocol. Significant variations between student plans were found. Correlation with assessment feedback was established for the larynx plans. Conclusion The data generated could be used to inform the selection of future assessment criteria, monitor student development, and provide useful feedback to the students. The provision of objective, quantitative evaluations of plan quality would be a valuable addition not only to radiotherapy education programmes but also to staff development and, potentially, credentialing methods. New functionality within TADA developed for this work could be applied clinically to, for example, evaluate protocol compliance.
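The abstract does not describe TADA's internals, but the kind of per-structure dose-volume reporting and naming-convention checking it mentions can be illustrated with a minimal Python sketch. All function names, the reference structure names and the synthetic dose values below are assumptions, not the published implementation.

```python
# Illustrative sketch only: the abstract describes TADA exporting DVHs and
# checking structure names against a reference protocol, but not how.
# REFERENCE_NAMES and all thresholds here are hypothetical.
import numpy as np

REFERENCE_NAMES = {"PTV", "CTV", "GTV", "SpinalCord", "Larynx"}  # hypothetical protocol

def cumulative_dvh(structure_doses_gy, bin_width_gy=0.1):
    """Cumulative DVH: fraction of structure volume receiving >= each dose level."""
    doses = np.asarray(structure_doses_gy, dtype=float)
    edges = np.arange(0.0, doses.max() + bin_width_gy, bin_width_gy)
    volume_fraction = np.array([(doses >= d).mean() for d in edges])
    return edges, volume_fraction

def coverage_v(structure_doses_gy, threshold_gy):
    """V_x metric: fraction of the structure receiving at least threshold_gy."""
    doses = np.asarray(structure_doses_gy, dtype=float)
    return float((doses >= threshold_gy).mean())

def check_naming(plan_structure_names, reference=REFERENCE_NAMES):
    """Report structure names that do not match the reference protocol."""
    return sorted(set(plan_structure_names) - reference)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ptv_doses = rng.normal(loc=66.0, scale=1.5, size=5000)  # synthetic dose samples (Gy)
    print("Coverage at 95% of 66 Gy:", coverage_v(ptv_doses, 0.95 * 66.0))
    print("Non-conforming names:", check_naming(["PTV", "cord", "Larynx"]))
```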
Abstract:
Use of the hand is vital in working life due to the grasping and pinching actions it performs. Spherical grip is the most commonly used, due to its similarity to the gripping of a computer mouse, so knowledge of its execution and of the elements involved is essential. Analysis of this exertion with surface electromyography devices (to register muscular activity) and accelerometer devices (to register movement values) can provide multiple variables. Six subjects performed ball gripping while electromyography (thenar region, hypothenar region, first dorsal interosseous, wrist flexors, flexor carpi ulnaris and wrist extensor muscles) and accelerometer (thumb, index, middle, ring and little fingers, and palm) values were recorded in real time. The obtained data were resampled in R and processed with a MATLAB script based on an automatic numerical sequence recognition program. Electromyography values were normalized to the maximum voluntary contraction, whilst the modulus of the acceleration vector was calculated. After processing and analysing the obtained data and signals, it was possible to identify five stages of movement according to the modulus of the vector from the palm. The statistical analysis of the variables was descriptive: averages and standard deviations. The outcome variables focus on the variations of the vector moduli (between the maximum and minimum values of each modulus and phase) and the maximum values of the normalized electromyography of each muscle. Analysis of movement through accelerometer and electromyography variables can give us an insight into the operation of spherical grip. The protocol and data treatment could be used as a system to complement existing assessments of the hand.
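The two signal-processing steps described above (normalising EMG to maximum voluntary contraction and taking the modulus of the acceleration vector) can be sketched as follows. The original work used R and a MATLAB script; this Python version, with illustrative function names and synthetic data, is only an assumption-laden restatement of those steps.

```python
# Minimal sketch: EMG expressed as percent of maximum voluntary contraction (MVC)
# and the accelerometer "module" taken as the Euclidean magnitude of x/y/z.
import numpy as np

def normalise_emg_to_mvc(emg_signal, mvc_value):
    """Express a rectified EMG envelope as a percentage of the MVC value."""
    return 100.0 * np.abs(np.asarray(emg_signal, dtype=float)) / mvc_value

def acceleration_module(acc_xyz):
    """Modulus of the acceleration vector for each sample (rows = samples)."""
    acc = np.asarray(acc_xyz, dtype=float)
    return np.linalg.norm(acc, axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    emg = rng.random(1000) * 0.8          # synthetic rectified EMG (mV)
    acc = rng.normal(size=(1000, 3))      # synthetic palm accelerometer samples (g)
    emg_pct_mvc = normalise_emg_to_mvc(emg, mvc_value=1.0)
    module = acceleration_module(acc)
    print("Peak EMG (%MVC):", emg_pct_mvc.max(), "Peak |a| (g):", module.max())
```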
Abstract:
A series of star-shaped organic semiconductors have been synthesized from 1,3,6,8-tetrabromopyrene. The materials are soluble in common organic solvents, allowing for solution processing of devices such as organic light-emitting diodes (OLEDs). One of the materials, 1,3,6,8-tetrakis(4-butoxyphenyl)pyrene, has been used as the active emitting layer in simple solution-processed OLEDs with deep blue emission (CIE = 0.15, 0.18) and maximum efficiencies and brightness levels of 2.56 cd/A and >5000 cd/m², respectively.
Abstract:
BACKGROUND: The use of nonstandardized N-terminal pro-B-type natriuretic peptide (NT-proBNP) assays can contribute to the misdiagnosis of heart failure (HF). Moreover, no common consensus has yet been established regarding the circulating forms of NT-proBNP measured by current assays. We aimed to characterize and quantify the various forms of NT-proBNP in the circulation of HF patients. METHODS: Plasma samples were collected from HF patients (n = 20) at rest and stored at -80 °C. NT-proBNP was enriched from HF patient plasma by immunoprecipitation followed by mass spectrometric analysis. Customized homogeneous sandwich AlphaLISA® immunoassays were developed and validated to quantify 6 fragments of NT-proBNP. RESULTS: Mass spectrometry identified the presence of several N- and C-terminally processed forms of circulating NT-proBNP, with physiological proteolysis between Pro2-Leu3, Leu3-Gly4, Pro6-Gly7, and Pro75-Arg76. Consistent with this result, AlphaLISA immunoassays demonstrated that antibodies targeting the extreme N or C termini measured a low apparent concentration of circulating NT-proBNP. The apparent circulating NT-proBNP concentration was increased with antibodies targeting nonglycosylated and nonterminal epitopes (P < 0.05). CONCLUSIONS: In plasma collected from HF patients, immunoreactive NT-proBNP was present as multiple N- and C-terminally truncated fragments of the full-length NT-proBNP molecule. Immunodetection of NT-proBNP was significantly improved with the use of antibodies that did not target these terminal regions. These findings support the development of a next-generation NT-proBNP assay targeting nonterminal epitopes while avoiding the central glycosylated region of this molecule. © 2013 American Association for Clinical Chemistry
Abstract:
• Premise of the study: Here we propose a staining protocol using TBO and Ruthenium red in order to reliably identify secondary compounds in the leaves of some species of Myrtaceae. • Methods and results: Leaves of 10 species representing 10 different genera of Myrtaceae were processed and stained using five different combinations of Ruthenium red and TBO. Optimal staining conditions were determined as 1 min of Ruthenium red (0.05% aqueous) and 45 sec of TBO (0.1% aqueous). Secondary compounds clearly identified under this treatment include mucilage in the mesophyll, polyphenols in the cuticle, lignin in fibers and xylem, tannins and carboxylated polysaccharides in the epidermis, and pectic substances in primary cell walls. • Conclusions: Potential applications of this protocol include systematic, phytochemical and ecological investigations in Myrtaceae. It might be applicable to other plant families rich in secondary compounds and could be used as a preliminary screening method for the extraction of these compounds.
Abstract:
Description of a patient's injuries is recorded in narrative text form by hospital emergency departments. For statistical reporting, this text data needs to be mapped to pre-defined codes. Existing research in this field uses the Naïve Bayes probabilistic method to build classifiers for mapping. In this paper, we focus on providing guidance on the selection of a classification method. We build a number of classifiers belonging to different classification families, such as decision tree, probabilistic, neural network, instance-based, ensemble-based and kernel-based linear classifiers. Extensive pre-processing is carried out to ensure the quality of the data and, hence, the quality of the classification outcome. Records with a null entry in the injury description are removed. Misspelling correction is carried out by finding and replacing each misspelt word with a sound-alike word. Meaningful phrases are identified and kept intact, instead of having parts of a phrase removed as stop words. Abbreviations appearing in many different forms of entry are manually identified and reduced to a single form. Clustering is utilised to discriminate between non-frequent and frequent terms. This process reduced the number of text features dramatically, from about 28,000 to 5,000. The medical narrative text injury dataset under consideration is composed of many short documents. The data can be characterized as high-dimensional and sparse, i.e., few features are irrelevant but features are correlated with one another. Therefore, matrix factorization techniques such as Singular Value Decomposition (SVD) and Non-Negative Matrix Factorization (NNMF) have been used to map the processed feature space to a lower-dimensional feature space. Classifiers were then built on this reduced feature space. In experiments, a set of tests was conducted to determine which classification method is best for this medical text classification task. The Non-Negative Matrix Factorization with Support Vector Machine method achieves 93% precision, which is higher than all the traditional classifiers tested. We also found that TF/IDF weighting, which works well for long text classification, is inferior to binary weighting for short document classification. Another finding is that the top-n terms should be removed in consultation with medical experts, as this affects the classification performance.
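The best-performing configuration described above (binary term weighting, NNMF dimensionality reduction, then a linear SVM) can be sketched as a scikit-learn pipeline. The example texts, codes and parameter values below are placeholders; the paper's exact feature counts, components and tuning are not given in the abstract.

```python
# Hedged sketch of the reported pipeline: binary bag-of-words (not TF/IDF),
# Non-Negative Matrix Factorization for dimensionality reduction, linear SVM.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import NMF
from sklearn.svm import LinearSVC
from sklearn.pipeline import Pipeline

injury_texts = [
    "fell from ladder fractured left wrist",
    "burn to right hand from hot oil",
    "slipped on wet floor fracture of ankle",
    "scald injury to forearm from boiling water",
]
injury_codes = [0, 1, 0, 1]  # placeholder fall / burn external-cause codes

pipeline = Pipeline([
    ("binary_bow", CountVectorizer(binary=True)),                # binary weighting
    ("nmf", NMF(n_components=2, init="nndsvda", max_iter=500)),  # reduced feature space
    ("svm", LinearSVC()),                                        # linear-kernel SVM
])

pipeline.fit(injury_texts, injury_codes)
print(pipeline.predict(["fracture of wrist after fall"]))
```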
Abstract:
This thesis studied cadmium sulfide and cadmium selenide quantum dots and their performance as light absorbers in quantum dot-sensitised solar cells. The research contributes to the understanding of the size-dependent photodegradation, passivation and particle growth mechanism of cadmium sulfide quantum dots prepared using the SILAR method, and of the role of ZnSe shell coatings in improving solar cell performance.
Abstract:
While the neural regions associated with facial identity recognition are considered to be well defined, the neural correlates of processing static and moving images of facial emotion are less clear. This study examined the brain electrical activity changes in 26 participants (14 males, M = 21.64, SD = 3.99; 12 females, M = 24.42, SD = 4.36) during a passive face viewing task, a scrambled face task and separate emotion and gender face discrimination tasks. The steady state visual evoked potential (SSVEP) was recorded from 64 electrode sites. Consistent with previous research, face-related activity was evidenced at scalp regions over the parieto-temporal region approximately 170 ms after stimulus presentation. Results also identified different SSVEP spatio-temporal changes associated with the processing of static and dynamic facial emotions with respect to gender, with static stimuli predominantly associated with an increase in inhibitory processing within the frontal region. Dynamic facial emotions were associated with changes in the SSVEP response within the temporal region, which are proposed to index inhibitory processing. It is suggested that static images represent non-canonical stimuli which are processed via different mechanisms to their more ecologically valid dynamic counterparts.
Abstract:
As a key element in their response to new media forcing transformations in mass media and media use, newspapers have deployed various strategies not only to establish online and mobile products and develop healthy business plans, but to set out to be dominant portals. Their response to change was the subject of an early investigation by one of the present authors (Keshvani 2000). That was part of a set of short studies inquiring into what impact new software applications and digital convergence might have on journalism practice (Tickle and Keshvani 2000), and also looking for demonstrations of the way that innovations, technologies and protocols then under development might produce a “wireless, streamlined electronic news production process” (Tickle and Keshvani 2001). The newspaper study compared the online products of The Age in Melbourne and the Straits Times in Singapore. It provided an audit of the Singapore and Australia Information and Communications Technology (ICT) climate, concentrating on the state of development of carrier networks as a determining factor in the potential strength of the two services in their respective markets. In the outcome, contrary to initial expectations, the early cable roll-out and extensive ‘wiring’ of the city in Singapore had not produced a level of uptake of Internet services as strong as that achieved in Melbourne by more ad hoc and varied strategies. By interpretation, while news websites and online content were at an early stage of development everywhere, and much the same as one another, no determining structural imbalance existed to separate these leading media participants in Australia and South-east Asia. The present research revisits that situation by again studying the online editions of the two large newspapers in the original study, and one other, The Courier Mail (included, in recognition of the diversification of types of product in this field, as a representative of Newscorp, now a major participant). The inquiry works through the principle of comparison. It is an exercise in qualitative, empirical research that establishes a comparison between the situation in 2000 as described in the earlier work, and the situation in 2014, after a decade of intense development in digital technology affecting the media industries. It is in that sense a follow-up study on the earlier work, although this time giving emphasis to the content and style of the actual products as experienced by their users. It compares the online and print editions of each of these three newspapers; then the three mastheads as print and online entities, among themselves; and finally it compares one against the other two, as representing a South-east Asian model and Australian models. This exercise is accompanied by a review of literature on the developments in ICT affecting media production and media organisations, to establish the changed context. The new study of the online editions is conducted as a systematic appraisal of the first level, or principal screens, of the three publications, over the course of six days (10-15.2.14 inclusive). For this, categories for analysis were made through a preliminary examination of the products over three days in the week before.
That process identified significant elements of media production, such as: variegated sourcing of materials; randomness in the presentation of items; differential production values among the media platforms considered, whether text, video or still images; the occasional repurposing and repackaging of top news stories of the day; and the presence of standard news values – once again drawn out of the trial ‘bundle’ of journalistic items. Reduced in this way, the online artefacts become comparable with the companion print editions from the same days. The categories devised and then used in the appraisal of the online products have been adapted to print, to give the closest match of sets of variables. This device, studying the two sets of publications on like standards (essentially production values and news values), has enabled the comparisons to be made. This comparing of the online and print editions of each of the three publications was set up as the first step in the investigation. In recognition of the nature of the artefacts, as ones that carry very diverse information by subject and level of depth and involve heavy creative investment in the formulation and presentation of the information, the assessment also includes an open section for interpreting and commenting on main points of comparison. This takes the form of a field for text, for the insertion of notes, in the table employed for summarising the features of each product, for each day. When the sets of comparisons as outlined above are noted, the process then becomes interpretative, guided by the notion of change. In the context of changing media technology and publication processes, what substantive alterations have taken place in the overall effort of news organisations in the print and online fields since 2001, and in their print and online products separately? Have they diverged or continued along similar lines? The remaining task is to begin to make inferences from that. Will the examination of findings reinforce the proposition that a review of the earlier study, and a forensic review of new models, does provide evidence of the character and content of change, especially change in journalistic products and practice? Will it permit an authoritative description of the essentials of such change in products and practice? Will it permit generalisation, and provide a reliable base for discussion of the implications of change, and future prospects? Preliminary observations suggest a more dynamic and diversified product has been developed in Singapore, well themed, and obviously sustained by public commitment and habituation to diversified online and mobile media services. The Australian products suggest a concentrated corporate and journalistic effort and deployment of resources, with a strong market focus, but less settled and ordered, and showing signs of limitations imposed by the delay in establishing a uniform, large broadband network. The scope of the study is limited. It is intended to test, and take advantage of, the original study as evidentiary material from the early days of newspaper companies’ experimentation with online formats. Both are small studies. The key opportunity for discovery lies in the ‘time capsule’ factor: the availability of well-gathered and processed information on major newspaper company production, at the threshold of a transformational decade of change in their industry. The comparison stands to identify key changes.
It should also be useful as a reference for further inquiries of the same kind that might be made, and for monitoring the situation in regard to newspaper portals online into the future.
Abstract:
π-Conjugated polymers are the most promising semiconductor materials to enable printed organic thin film transistors (OTFTs) due to their excellent solution processability and mechanical robustness. However, solution-processed polymer semiconductors have shown poor charge transport properties, mainly originating from disordered polymer chain packing in the solid state, as compared to thermally evaporated small-molecule organic semiconductors. The low charge carrier mobility of polymer semiconductors, typically <0.1 cm²/V·s, poses a challenge for most intended applications such as displays and radio-frequency identification (RFID) tags. Here we present our recent results on diketopyrrolopyrrole (DPP)-based polymers and demonstrate that when DPP is combined with appropriate electron-donating moieties such as thiophene and thienothiophene, very high charge carrier mobility values of ~1 cm²/V·s can be achieved.
Abstract:
Collections of biological specimens are fundamental to scientific understanding and characterization of natural diversity - past, present and future. This paper presents a system for liberating useful information from physical collections by bringing specimens into the digital domain so they can be more readily shared, analyzed, annotated and compared. It focuses on insects and is strongly motivated by the desire to accelerate and augment current practices in insect taxonomy which predominantly use text, 2D diagrams and images to describe and characterize species. While these traditional kinds of descriptions are informative and useful, they cannot cover insect specimens "from all angles" and precious specimens are still exchanged between researchers and collections for this reason. Furthermore, insects can be complex in structure and pose many challenges to computer vision systems. We present a new prototype for a practical, cost-effective system of off-the-shelf components to acquire natural-colour 3D models of insects from around 3 mm to 30 mm in length. ("Natural-colour" is used to contrast with "false-colour", i.e., colour generated from, or applied to, gray-scale data post-acquisition.) Colour images are captured from different angles and focal depths using a digital single lens reflex (DSLR) camera rig and two-axis turntable. These 2D images are processed into 3D reconstructions using software based on a visual hull algorithm. The resulting models are compact (around 10 megabytes), afford excellent optical resolution, and can be readily embedded into documents and web pages, as well as viewed on mobile devices. The system is portable, safe, relatively affordable, and complements the sort of volumetric data that can be acquired by computed tomography. This system provides a new way to augment the description and documentation of insect species holotypes, reducing the need to handle or ship specimens. It opens up new opportunities to collect data for research, education, art, entertainment, biodiversity assessment and biosecurity control. © 2014 Nguyen et al.
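The abstract names a visual hull algorithm but does not give the reconstruction code. The following minimal voxel-carving sketch conveys the idea of intersecting silhouettes from calibrated views; the pinhole projection matrices, grid resolution and silhouette masks are illustrative assumptions, not the published system's implementation.

```python
# Minimal voxel-carving sketch of a visual hull reconstruction.
import numpy as np

def carve_visual_hull(silhouettes, projections, grid_min, grid_max, resolution=64):
    """Keep voxels whose projection falls inside every silhouette mask.

    silhouettes: list of HxW boolean arrays (True = inside the insect outline).
    projections: list of 3x4 camera projection matrices, one per silhouette.
    """
    axes = [np.linspace(grid_min[i], grid_max[i], resolution) for i in range(3)]
    xs, ys, zs = np.meshgrid(*axes, indexing="ij")
    points = np.stack([xs, ys, zs, np.ones_like(xs)], axis=-1).reshape(-1, 4)  # homogeneous
    occupied = np.ones(points.shape[0], dtype=bool)

    for mask, P in zip(silhouettes, projections):
        proj = points @ P.T                      # project voxel centres into the image
        uv = proj[:, :2] / proj[:, 2:3]          # perspective divide -> pixel coordinates
        u = np.round(uv[:, 0]).astype(int)
        v = np.round(uv[:, 1]).astype(int)
        h, w = mask.shape
        inside_image = (u >= 0) & (u < w) & (v >= 0) & (v < h)
        inside_silhouette = np.zeros_like(occupied)
        inside_silhouette[inside_image] = mask[v[inside_image], u[inside_image]]
        occupied &= inside_silhouette            # carve away voxels outside any view

    return points[occupied, :3]                  # surviving voxel centres
```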
Abstract:
We analyse the security of iterated hash functions that compute an input-dependent checksum which is processed as part of the hash computation. We show that a large class of such schemes, including those using non-linear or even one-way checksum functions, is not secure against the second preimage attack of Kelsey and Schneier, the herding attack of Kelsey and Kohno and the multicollision attack of Joux. Our attacks also apply to a large class of cascaded hash functions. Our second preimage attacks on the cascaded hash functions improve the results of Joux presented at Crypto'04. We also apply our attacks to the MD2 and GOST hash functions. Our second preimage attacks on the MD2 and GOST hash functions improve the previous best known short-cut second preimage attacks on these hash functions by factors of at least 2^26 and 2^54, respectively. Our herding and multicollision attacks on the hash functions based on generic checksum functions (e.g., one-way) are a special case of the attacks on the cascaded iterated hash functions previously analysed by Dunkelman and Preneel and are not better than their attacks. On hash functions with easily invertible checksums, our multicollision and herding attacks (if the hash value is short, as in MD2) are more efficient than those of Dunkelman and Preneel.
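To make the construction under analysis concrete, the following toy sketch shows a Merkle-Damgård style iteration that also accumulates an input-dependent checksum and processes it in a final compression call, as MD2- and GOST-style designs do. The compression and checksum functions here are placeholders built from hashlib, not the real MD2 or GOST primitives, and the sketch does not implement any of the attacks.

```python
# Toy sketch of an iterated hash with an input-dependent checksum block.
import hashlib

BLOCK = 16  # toy block size in bytes

def compress(state: bytes, block: bytes) -> bytes:
    """Placeholder compression function: new 16-byte state from state || block."""
    return hashlib.sha256(state + block).digest()[:BLOCK]

def checksum_update(checksum: bytes, block: bytes) -> bytes:
    """Placeholder input-dependent checksum (a simple XOR accumulator)."""
    return bytes(c ^ b for c, b in zip(checksum, block))

def checksum_hash(message: bytes) -> bytes:
    # Pad with zero bytes to a whole number of blocks (toy padding only).
    padded = message + b"\x00" * (-len(message) % BLOCK)
    state, checksum = b"\x00" * BLOCK, b"\x00" * BLOCK
    for i in range(0, len(padded), BLOCK):
        block = padded[i:i + BLOCK]
        state = compress(state, block)                 # normal iterated hashing
        checksum = checksum_update(checksum, block)    # accumulate the checksum
    return compress(state, checksum)                   # checksum processed as a final block

print(checksum_hash(b"example message").hex())
```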
Abstract:
We propose a new way to build a combined list from K base lists, each containing N items. A combined list consists of top segments of various sizes from each base list so that the total size of all top segments equals N. A sequence of item requests is processed and the goal is to minimize the total number of misses. That is, we seek to build a combined list that contains all the frequently requested items. We first consider the special case of disjoint base lists. There, we design an efficient algorithm that computes the best combined list for a given sequence of requests. In addition, we develop a randomized online algorithm whose expected number of misses is close to that of the best combined list chosen in hindsight. We prove lower bounds that show that the expected number of misses of our randomized algorithm is close to the optimum. In the presence of duplicate items, we show that computing the best combined list is NP-hard. We show that our algorithms still apply to a linearized notion of loss in this case. We expect that this new way of aggregating lists will find many ranking applications.
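The abstract does not reproduce the algorithms themselves; as a point of reference, the offline problem for disjoint base lists can be written as a straightforward dynamic program over prefix sizes that sum to N, sketched below. This O(K·N²) sketch, with illustrative names and toy data, is only one way to compute the best combined list in hindsight and may differ from the paper's more efficient algorithm and its online variant.

```python
# Hedged sketch: choose a top segment from each of K disjoint lists, with the
# segment sizes summing to N, maximising the frequency of covered requests.
from collections import Counter

def best_combined_list(base_lists, requests):
    """base_lists: K disjoint lists, each of length N. Returns (misses, segment sizes)."""
    n = len(base_lists[0])
    freq = Counter(requests)
    # prefix_gain[k][j] = requests covered by taking the top j items of list k
    prefix_gain = []
    for lst in base_lists:
        gains = [0]
        for item in lst:
            gains.append(gains[-1] + freq[item])
        prefix_gain.append(gains)

    # dp[b] = best coverage using the lists seen so far with total budget b
    dp = [0] + [float("-inf")] * n
    choice = []  # back-pointers: choice[k][b] = segment size taken from list k
    for gains in prefix_gain:
        new_dp = [float("-inf")] * (n + 1)
        picks = [0] * (n + 1)
        for b in range(n + 1):
            for j in range(b + 1):
                if dp[b - j] + gains[j] > new_dp[b]:
                    new_dp[b] = dp[b - j] + gains[j]
                    picks[b] = j
        dp, choice = new_dp, choice + [picks]

    # Recover the chosen segment sizes for a total budget of exactly N.
    sizes, b = [], n
    for picks in reversed(choice):
        sizes.append(picks[b])
        b -= picks[b]
    sizes.reverse()
    return len(requests) - dp[n], sizes

lists = [["a", "b", "c"], ["x", "y", "z"]]
print(best_combined_list(lists, ["a", "x", "x", "c", "y", "a"]))  # -> (1, [1, 2])
```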
Abstract:
During Pavlovian auditory fear conditioning, a previously neutral auditory stimulus (CS) gains emotional significance through pairing with a noxious unconditioned stimulus (US). These associations are believed to be formed by way of plasticity at auditory input synapses on principal neurons in the lateral nucleus of the amygdala (LA). In order to begin to understand how fear memories are stored and processed by synaptic changes in the LA, we have quantified both the total neuronal number and the sub-cellular structure of LA principal neurons. We first used stereological cell counting methods on Giemsa- or GABA-immunostained rat brain. We identified 60,322 ± 1,408 neurons in the LA unilaterally (n = 7). Of these, 16,917 ± 471 were GABA-positive. The intercalated nuclei were excluded from the counts and thus the GABA-positive cells are believed to represent GABAergic interneurons. The sub-nuclei of the LA were also counted independently. We then quantified the morphometric properties of in vitro electrophysiologically identified principal neurons of the LA, corrected for shrinkage in the x, y and z planes. The total dendritic length was 9.97 ± 2.57 mm, with 21 ± 4 nodes (n = 6). Dendritic spine density was 0.19 ± 0.03 spines/µm (n = 6). Intra-LA axon collaterals had a bouton density of 0.1 ± 0.02 boutons/µm (n = 5). These data begin to reveal the finite cellular and sub-cellular processing capacity of the lateral amygdala, and should facilitate efforts to understand mechanisms of plasticity in the LA.