347 results for machine tool
Abstract:
An application that translates raw thermal melt curve data into more easily assimilated knowledge is described. This program, called ‘Meltdown’, performs a number of data remediation steps before classifying melt curves and estimating melting temperatures. The final output is a report that summarizes the results of a differential scanning fluorimetry experiment. Meltdown uses a Bayesian classification scheme, enabling reproducible identification of various trends commonly found in DSF datasets. The goal of Meltdown is not to replace human analysis of the raw data, but to provide a sensible interpretation of the data to make this useful experimental technique accessible to naïve users, as well as providing a starting point for detailed analyses by more experienced users.
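Meltdown's Bayesian classification scheme is not described in enough detail here to reproduce, but the derivative heuristic that most DSF tools build on, taking the melting temperature Tm as the point of steepest fluorescence increase, can be sketched as follows. The sigmoidal melt curve and the function names are illustrative, not Meltdown's actual pipeline:

```python
import math

def estimate_tm(temps, fluorescence):
    """Estimate Tm as the temperature of the steepest fluorescence
    increase, i.e. the maximum of the first derivative."""
    derivs = [
        (fluorescence[i + 1] - fluorescence[i]) / (temps[i + 1] - temps[i])
        for i in range(len(temps) - 1)
    ]
    i_max = max(range(len(derivs)), key=lambda i: derivs[i])
    # Report the midpoint of the steepest interval
    return 0.5 * (temps[i_max] + temps[i_max + 1])

# Synthetic melt curve: a sigmoid with its transition centred at 55 °C
temps = [30.0 + 0.5 * i for i in range(101)]  # 30–80 °C in 0.5 °C steps
curve = [1.0 / (1.0 + math.exp(-(t - 55.0))) for t in temps]
tm = estimate_tm(temps, curve)
```

On real, noisy curves this estimate would follow the data-remediation steps the abstract mentions; here the clean synthetic curve makes the inflection point recover the true midpoint.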
Abstract:
Corporate executives require relevant, intelligent business information in real time to make strategic decisions, and they require the freedom to access this information anywhere, at any time. There is a need to extend this functionality beyond the office and put it at the fingertips of decision makers. The Mobile Business Intelligence Tool (MBIT) aims to provide these features in a flexible and cost-efficient manner. This paper describes the detailed architecture of MBIT, designed to overcome the limitations of existing mobile business intelligence tools. Further, a detailed implementation framework is presented to realize the design. This research highlights the benefits of using service-oriented architecture to design flexible, platform-independent mobile business applications. © 2009 IEEE.
Abstract:
Context: Pheochromocytomas and paragangliomas (PPGLs) are heritable neoplasms that can be classified into gene-expression subtypes corresponding to their underlying specific genetic drivers. Objective: This study aimed to develop a diagnostic and research tool (Pheo-type) capable of classifying PPGL tumors into gene-expression subtypes that could be used to guide and interpret genetic testing, determine surveillance programs, and aid in the elucidation of PPGL biology. Design: A compendium of published microarray data representing 205 PPGL tumors was used for the selection of subtype-specific genes that were then translated to the Nanostring gene-expression platform. A support vector machine was trained on the microarray dataset and then tested on an independent Nanostring dataset representing 38 familial and sporadic cases of PPGL of known genotype (RET, NF1, TMEM127, MAX, HRAS, VHL, and SDHx). Different classifier models involving between three and six subtypes were compared for their discrimination potential. Results: A gene set of 46 genes and six endogenous controls was selected, representing six known PPGL subtypes: RTK1–3 (RET, NF1, TMEM127, and HRAS), MAX-like, VHL, and SDHx. Of the 38 test cases, 34 (90%) were correctly assigned to the six subtypes based on the known genotype-to-gene-expression-subtype association. Removal of the RTK2 subtype from training, characterized by an admixture of tumor and normal adrenal cortex, improved the classification accuracy (35/38). Consolidation of the RTK and pseudohypoxic PPGL subtypes into four- and then three-class architectures improved the classification accuracy for clinical application. Conclusions: The Pheo-type gene-expression assay is a reliable method for predicting PPGL genotype using routine diagnostic tumor samples.
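The study's trained classifier is not published with the abstract, but the core of a support vector machine, a maximum-margin linear separator fitted here by sub-gradient descent on the hinge loss, can be sketched from scratch. The two-feature "expression profiles" below are synthetic stand-ins, not Nanostring data, and all parameter values are illustrative:

```python
import random

def train_linear_svm(xs, ys, epochs=200, lr=0.01, lam=0.01):
    """Minimal linear SVM: sub-gradient descent on the regularised
    hinge loss. Labels must be +1 or -1."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            if y * (w[0] * x[0] + w[1] * x[1] + b) < 1:  # margin violated
                w[0] += lr * (y * x[0] - lam * w[0])
                w[1] += lr * (y * x[1] - lam * w[1])
                b += lr * y
            else:                                        # only shrink w
                w[0] -= lr * lam * w[0]
                w[1] -= lr * lam * w[1]
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b >= 0 else -1

# Two well-separated synthetic "subtypes"
random.seed(0)
xs = ([(random.gauss(2, 0.3), random.gauss(2, 0.3)) for _ in range(20)]
      + [(random.gauss(-2, 0.3), random.gauss(-2, 0.3)) for _ in range(20)])
ys = [1] * 20 + [-1] * 20
w, b = train_linear_svm(xs, ys)
accuracy = sum(predict(w, b, x) == y for x, y in zip(xs, ys)) / len(xs)
```

A real multi-class, 46-gene version would train one such separator per subtype pair or use a library implementation; the two-class, two-gene toy only shows the margin mechanics.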
Abstract:
Identifying unusual or anomalous patterns in an underlying dataset is an important but challenging task in many applications. The focus of the unsupervised anomaly detection literature has mostly been on vectorised data. However, many applications are more naturally described using higher-order tensor representations. Approaches that vectorise tensorial data can destroy the structural information encoded in the high-dimensional space and lead to the curse of dimensionality. In this paper we present the first unsupervised tensorial anomaly detection method, along with a randomised version of our method. Our anomaly detection method, the One-class Support Tensor Machine (1STM), is a generalisation of conventional one-class Support Vector Machines to higher-order spaces. 1STM preserves the multiway structure of tensor data while achieving significant improvements in accuracy and efficiency over conventional vectorised methods. We then leverage the theory of nonlinear random projections to propose the Randomised 1STM (R1STM). Our empirical analysis on several real and synthetic datasets shows that the R1STM algorithm delivers accuracy comparable to or better than a state-of-the-art deep learning method and traditional kernelised approaches for anomaly detection, while being approximately 100 times faster in training and testing.
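The 1STM formulation itself cannot be reconstructed from an abstract, but the one-class setting it generalises (fit a boundary around normal training data, then flag points falling outside it) can be illustrated with a toy distance-based detector. The centroid-plus-radius rule, the 95% coverage choice and the synthetic data are all illustrative assumptions, not the paper's method:

```python
import random

def fit_oneclass(train):
    """Toy one-class model: centroid of the normal data plus a radius
    chosen to cover roughly 95% of the training points."""
    dim = len(train[0])
    centroid = [sum(x[i] for x in train) / len(train) for i in range(dim)]
    dists = sorted(
        sum((a - c) ** 2 for a, c in zip(x, centroid)) ** 0.5 for x in train
    )
    radius = dists[int(0.95 * len(dists))]
    return centroid, radius

def is_anomaly(model, x):
    """A point is anomalous if it lies outside the learned radius."""
    centroid, radius = model
    dist = sum((a - c) ** 2 for a, c in zip(x, centroid)) ** 0.5
    return dist > radius

random.seed(1)
normal = [[random.gauss(0, 1) for _ in range(4)] for _ in range(200)]
model = fit_oneclass(normal)
outlier = [10.0, 10.0, 10.0, 10.0]
```

A one-class SVM replaces the fixed centroid-distance boundary with a learned, possibly kernelised one, and 1STM additionally keeps tensor-mode structure instead of flattening each example to a vector.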
Abstract:
It has been said that we are living in a golden age of innovation. New products, systems and services aimed at enabling a better future have emerged from novel interconnections of design and design research with science, technology and the arts. These intersections are now, more than ever, catalysts that enrich daily activities in health and safety, education, personal computing, entertainment and sustainability, to name a few. Interactive functions made possible by new materials, technology and emerging manufacturing solutions demonstrate an ongoing interplay between cross-disciplinary knowledge and research. This interplay raises questions concerning: (i) how art and design provide a focus for developing design solutions and research in technology; (ii) how theories emerging from the interactions of cross-disciplinary knowledge inform both the practice and research of design; and (iii) how research and design work together in a mutually beneficial way. The IASDR2015 INTERPLAY EXHIBITION provides some examples of these interconnections of design research with science, technology and the arts. This is done through the presentation of objects, artefacts and demonstrations that are contextualised in everyday activities across various areas including health, education, safety, furniture, fashion and wearable design. The exhibits provide a setting to explore the various ways in which design research interacts across disciplinary knowledge and approaches to stimulate innovation. In education, Designing South African Children’s Health Education as Generative Play (A Bennett, F Cassim, M van der Merwe, K van Zijil, and M Ribbens) presents a set of toolkits that resulted from design research entailing generative play. The toolkits are systems that engender pleasure and responsibility, and are aimed at cultivating South African youth’s awareness of nutrition, hygiene, disease awareness and prevention, and social health.
In safety, AVAnav: Avalanche Rescue Helmet (Jason Germany) delivers an interactive system that helps reduce the time needed to locate buried avalanche victims. Helmet-mounted, the system responds to the contextual needs of rescuers and has since led to further design research on the interface design of rescue devices. In apparel design and manufacturing, Shrinking Violets: Fashion design for disassembly (Alice Payne) proposes design for disassembly through beautiful, reversible mono-material garments that interactively respond to the challenges of garment construction in the fashion industry, capturing a metaphor for the interplay between technology and craft in fashion manufacturing. Harvest: A biotextile future (Dean Brough and Alice Payne) explores the interplay of biotechnology, materiality and textile design in the creation of a sustainable, biodegradable vegan textile produced by a symbiotic culture of bacteria and yeast (SCOBY). SCOBY is a pellicle curd that can be harvested, machine washed, dried and cut into a variety of designs and texture combinations. The exploration of smart materials, wearable design and micro-electronics led to creative and aesthetically coherent stimulus-reactive jewellery in Symbiotic Microcosms: Crafting Digital Interaction (K Vones). This creation aims to bridge the gap between craft practitioner and scientific discovery, proposing a move towards the notion of a post-human body, where wearable design is seen as potential ground for new human-computer interactions, affording the development of visually engaging multifunctional enhancements. In furniture design, Smart Assistive chair for older adults (Chao Zhao) demonstrates how cross-disciplinary knowledge, interacting with design strategies, provides solutions that employ new technological developments in aged care, with the participation of multiple stakeholders: designers, the health care system and community-based health systems.
In health, Molecular diagnosis system for newborns deafness genetic screening (Chao Zhao) presents an ambitious and complex project that includes a medical device aimed at resolving a number of challenges: technical feasibility in city and rural contexts, compatibility with standard laboratory and hospital systems, access to the health system, and support for the work of different hospital specialists. The interplay between disciplines is evident in this work, demonstrating how design research moves forward through technology developments. These works exemplify the intersection between domains as a means to innovation. Novel design problems are identified as design intersects with the various areas. Research informs this process in different ways. We see background investigation into the contextualising domain (e.g. on-snow studies, garment recycling, South African health concerns, the post-human body) to identify gaps in the area and design criteria; technology and materials reviews (e.g. AR, biotextiles) to offer plausible technical means to solve these, as well as design criteria. Theoretical reviews can also inform the design (e.g. play, flow). These work together to equip the design practitioner with a robust set of ‘tools’ for design innovation, tools that are grounded in research. The process identifies innovative opportunities and criteria for design, and this, in turn, provides a means for evaluating the success of the design outcomes. Such an approach has the potential to come full circle between research and design, where the design can function as an exemplar, evidencing how the research-articulated problems can be solved. Core to this, however, is the evaluation of the design outcome itself and the identification of knowledge outcomes. In some cases this is fairly straightforward, that is, easily measurable. For example, the efficacy of Jason Germany’s helmet can be determined by measuring the reduced response time of the rescuer.
Similarly, the improved recyclability of Payne’s panel garments can be clearly determined by comparing them with existing recycling processes (and her identified criterion of separating textile elements); while the sustainability and durability of Brough and Payne’s biotextile can be assessed by documenting the growth and decay processes, or through comparative strength studies. There are, however, situations where knowledge outcomes and insights are not so easily determined. Many of the works here are open-ended in nature, as they emphasise the holistic experience of one or more designs in context: “[it is not] the end result of the art activity that provides the health benefit or outcome but rather, the value lies in the delivery and experience of the activity” (Bennett et al.). Similarly, reconfiguring layers of laser-cut silk in Payne’s Shrinking Violets constitutes a customisable, creative process of clothing oneself, since it “could be layered to create multiple visual effects”. Symbiotic Microcosms also has room for facilitating experience, as the work is described as facilitating “serendipitous discovery”. These examples show the diverse emphasis of enquiry, on the experience versus the product. Open-ended experiences are ambiguous, multifaceted and differ from person to person and moment to moment (Eco 1962). Determining success is not always clear or immediately discernible; it may also not be the most useful question to ask. Rather, research that seeks to understand the nature of the experience afforded by the artefact is most useful in these situations. It can inform the design practitioner by helping with subsequent re-design, and it is potentially generalisable to other designers and design contexts. Bennett et al. exemplify how this may be approached from a theoretical perspective. This work is concerned with facilitating engaging experiences to educate and, ultimately, impact that community.
The research is concerned with the nature of that experience as well, and to that end the authors have employed theoretical lenses, here those of flow, pleasure and play. An alternative or complementary approach to using theory is qualitative study, such as interviews asking users what they experienced. Here the user insights become evidence to generalise across, potentially revealing insight into relevant concerns, such as the range of possible ‘playful’ experiences that may be afforded, or the situations that precede a ‘serendipitous discovery’. As shown, the IASDR2015 INTERPLAY EXHIBITION provides a platform for exploration, discussion and interrogation of the interplay of design research across diverse domains. We look forward with excitement as IASDR continues to bring research and design together, and as our communities of practitioners continue to push the envelope of what design is and how it can be expanded and better understood through research to foster new work and, ultimately, stimulate innovation.
Abstract:
This paper presents a flexible, integrated planning tool for active distribution networks that maximises the benefits of high levels of renewables, customer engagement and new technology implementations. The tool has two main processing parts: “optimisation” and “forecast”. The “optimisation” part is an automated, integrated planning framework that optimises the net present value (NPV) of an investment strategy for electric distribution network augmentation over large areas and long planning horizons (e.g. 5 to 20 years), based on a modified particle swarm optimisation (MPSO). The “forecast” part is a flexible agent-based framework that produces load duration curves (LDCs) of load forecasts for different levels of customer engagement, energy storage control and electric vehicle (EV) uptake. In addition, “forecast” connects the utility's existing databases to the proposed tool and outputs the load profiles and network plan in Google Earth. This integrated tool enables different divisions within a utility to analyse their programs and options in a single platform using comprehensive information.
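The paper's modified PSO (MPSO) is not specified in the abstract, so the sketch below shows the canonical global-best PSO on a toy objective standing in for the NPV function; the inertia and acceleration coefficients are common textbook defaults, not the authors' values:

```python
import random

def pso(objective, dim, n_particles=20, iters=100, bounds=(-5.0, 5.0)):
    """Minimal global-best particle swarm optimisation (minimisation)."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                   # per-particle best position
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (0.7 * vel[i][d]                         # inertia
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])  # cognitive
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))    # social
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

random.seed(2)
# Toy stand-in for the NPV objective: a shifted sphere, minimum at (1, 1, 1)
best, best_val = pso(lambda x: sum((xi - 1.0) ** 2 for xi in x), dim=3)
```

A real network-augmentation objective would evaluate discounted costs and benefits of candidate investment plans instead of the sphere function; the swarm mechanics stay the same.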
Abstract:
Recent technical advances have enabled, for the first time, reliable in vitro culture of prostate cancer samples as prostate cancer organoids. This breakthrough raises the significant possibility of high-throughput drug screening covering the spectrum of prostate cancer phenotypes seen clinically. These advances will enable precision medicine to become a reality, allowing patient samples to be screened for effective therapeutics ex vivo, with treatments tailored to the individual. This will hopefully lead to enhanced clinical outcomes, avoid morbidity due to ineffective therapies and improve quality of life in men with advanced prostate cancer.
Abstract:
Virtual machine (VM) management is an obvious need in today's data centers for various management activities and is accomplished in two phases: finding an optimal VM placement plan, and implementing that placement through live VM migrations. These phases give rise to two research problems: the VM placement problem (VMPP) and the VM migration scheduling problem (VMMSP). This research proposes and develops several evolutionary and heuristic algorithms to address the VMPP and VMMSP. Experimental results show the effectiveness and scalability of the proposed algorithms. Finally, a VM management framework is proposed and developed to automate VM management activity in a cost-efficient way.
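The proposed evolutionary algorithms are not detailed in the abstract; a classic baseline heuristic for the VM placement problem (VMPP) is first-fit decreasing bin packing, sketched here with hypothetical single-resource CPU demands:

```python
def first_fit_decreasing(vm_demands, host_capacity):
    """Toy VM placement: sort demands descending, place each VM on the
    first host with enough remaining capacity, opening a new host when
    none fits. Returns a list of hosts, each a list of VM demands."""
    hosts = []
    for demand in sorted(vm_demands, reverse=True):
        for host in hosts:
            if sum(host) + demand <= host_capacity:
                host.append(demand)
                break
        else:  # no existing host fits: open a new one
            hosts.append([demand])
    return hosts

# Hypothetical CPU demands (units) packed onto 10-unit hosts
placement = first_fit_decreasing([4, 8, 1, 4, 2, 1], host_capacity=10)
```

Real VMPP formulations are multi-dimensional (CPU, memory, network) and weigh migration costs, which is why the research turns to evolutionary search; first-fit decreasing is the usual greedy yardstick such methods are compared against.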
Abstract:
People with disabilities (PWD) experience difficulties in accessing the transport system (including both infrastructure and services) to meet their needs for health care, employment and other activities. Our research shows that lack of access to the journeys needed for these purposes is a more significant barrier in low- and middle-income countries than in high-income countries, and results in inadequate health care, rehabilitation and access to education and employment. At the same time, the existing transport system in low- and middle-income countries presents much higher road crash risks than in high-income countries. By combining the principles and methods of Road Safety Audit and disability access, and adapting these Western approaches to a low/middle-income country context, we have worked with Handicap International Cambodia to develop a Journey Access Tool (JAT) for use by disabled people’s organisations (DPOs), people with a disability and other key stakeholders. A key element of the approach is that it involves the participation of PWD on the journeys that they need to take, and it identifies the infrastructure and service improvements that should be prioritised in order to facilitate access to these journeys. The JAT has been piloted in Cambodia with a range of PWD. This presentation will outline the design of the JAT and the results of the pilot studies. The information gained thus far strongly suggests that the JAT is a valuable and cost-effective approach that can be used by DPOs and professionals to identify barriers to access and prioritise the steps needed to address them.
Abstract:
The New Zealand White rabbit has been widely used as a model of limbal stem cell deficiency (LSCD). Current techniques for experimental induction of LSCD utilize caustic chemicals, or organic solvents applied in conjunction with a surgical limbectomy. While generally successful in depleting epithelial progenitors, the depth and severity of injury is difficult to control using chemical-based methods. Moreover, the anterior chamber can be easily perforated while surgically excising the corneal limbus. In the interest of creating a safer and more defined LSCD model, we have therefore evaluated a mechanical debridement technique based upon use of the AlgerBrush II rotating burr. An initial comparison of debridement techniques was conducted in situ using 24 eyes in freshly acquired New Zealand White rabbit cadavers. Techniques for comparison (4 eyes each) included: (1) non-wounded control, (2) surgical limbectomy followed by treatment with 100% (v/v) n-heptanol to remove the corneal epithelium (1-2 minutes), (3) treatment of both limbus and cornea with n-heptanol alone, (4) treatment of both limbus and cornea with 20% (v/v) ethanol (2-3 minutes), (5) a 2.5-mm rounded burr applied to both the limbus and cornea, and (6) a 1-mm pointed burr applied to the limbus, followed by the 2.5-mm rounded burr applied to the cornea. All corneas were excised and processed for histology immediately following debridement. A panel of four assessors subsequently scored the degree of epithelial debridement within the cornea and limbus using masked slides. The 2.5-mm burr most consistently removed the corneal and limbal epithelia. Islands of limbal epithelial cells were occasionally retained following surgical limbectomy/heptanol treatment, or use of the 1-mm burr. Limbal epithelial cells were consistently retained following treatment with either ethanol or n-heptanol alone, with ethanol being the least effective treatment overall. 
The 2.5-mm burr method was subsequently evaluated in the right eye of 3 live rabbits by weekly clinical assessments (photography and slit lamp examination) for up to 5 weeks, followed by histological analyses (hematoxylin & eosin stain, periodic acid-Schiff stain and immunohistochemistry for keratin 3 and 13). All 3 eyes that had been completely debrided using the 2.5-mm burr displayed symptoms of ocular surface failure as defined by retention of a prominent epithelial defect (~40% of corneal surface at 5 weeks), corneal neovascularization (2 to 3 quadrants), reduced corneal transparency and conjunctivalization of the corneal surface (demonstrated by the presence of goblet cells and/or staining for keratin 13). In conclusion, our findings indicate that the AlgerBrush II rotating burr is an effective method for the establishment of ocular surface failure in New Zealand White rabbits. In particular, we recommend use of the 2.5-mm rotating burr for improved efficiency of epithelial debridement and safety compared to surgical limbectomy.
Abstract:
Elucidating the structure and dynamics of lamellipodia and filopodia in response to different stimuli is a topic of continuing interest in cancer cells, as these structures may be attractive targets for therapeutic purposes. Interestingly, a close functional relationship between these actin-rich protrusions and specialized membrane domains has recently been demonstrated. The aim of this study was therefore to investigate the fine organization of these actin-rich structures and examine how they may structurally relate to detergent-resistant membrane (DRM) domains in the MTLn3 EGF/serum starvation model. For this reason, we designed a straightforward alternative method to study cytoskeleton arrays and their associated structures by means of correlative fluorescence (/laser) and electron microscopy (CFEM). CFEM on whole-mounted breast cancer cells revealed that a lamellipodium is composed of an intricate filamentous actin web organized in various patterns after different treatments. Both actin dots and DRMs were resolved, and were closely interconnected with the surrounding cytoskeleton. Long actin filaments were repeatedly observed extending beyond the leading edge, and their density and length varied after different treatments. Furthermore, CFEM also allowed us to demonstrate the close structural association of DRMs with the cytoskeleton in general, and with the filamentous/dot-like structural complexes in particular, suggesting that they are all functionally linked and consequently may regulate the cell's fingertip dynamics. Finally, electron tomographic modelling on the same CFEM samples confirmed that these extensions are clearly embedded within the cytoskeletal matrix of the lamellipodium.
Abstract:
Purpose: In the oncology population, where malnutrition prevalence is high, more descriptive screening tools can provide further information to assist triaging and capture acute change. The Patient-Generated Subjective Global Assessment Short Form (PG-SGA SF) is a component of a nutritional assessment tool that could be used for descriptive nutrition screening. The purpose of this study was to conduct a secondary analysis of nutrition screening and assessment data to identify the most relevant information contributing to the ability of the PG-SGA SF to identify malnutrition risk with high sensitivity and specificity. Methods: This was an observational, cross-sectional study of 300 consecutive adult patients receiving ambulatory anti-cancer treatment at an Australian tertiary hospital. Anthropometric and patient descriptive data were collected. The scored PG-SGA generated a score for nutritional risk (PG-SGA SF) and a global rating for nutrition status. Receiver operating characteristic (ROC) curves were generated to determine optimal cut-off scores for combinations of the PG-SGA SF boxes with the greatest sensitivity and specificity for predicting malnutrition according to the scored PG-SGA global rating. Results: The additive scores of boxes 1–3 had the highest sensitivity (90.2%) while maintaining satisfactory specificity (67.5%) and demonstrating high diagnostic value (AUC = 0.85, 95% CI = 0.81–0.89). The inclusion of box 4 did not add further value as a screening tool (AUC = 0.85, 95% CI = 0.80–0.89; sensitivity 80.4%; specificity 72.3%). Conclusions: The validity of the PG-SGA SF in chemotherapy outpatients was confirmed. The present study, however, demonstrated that the functional capacity question (box 4) does not improve the overall discriminatory value of the PG-SGA SF.
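The cut-off selection behind a ROC analysis can be sketched as a scan over candidate thresholds, here scored by Youden's J (sensitivity + specificity - 1); the PG-SGA SF scores and malnutrition labels below are synthetic, not the study's data:

```python
def best_cutoff(scores, labels):
    """Scan candidate cutoffs and return the one maximising Youden's
    J = sensitivity + specificity - 1 (labels: 1 = malnourished)."""
    best = None
    for cut in sorted(set(scores)):
        tp = sum(s >= cut and l == 1 for s, l in zip(scores, labels))
        fn = sum(s < cut and l == 1 for s, l in zip(scores, labels))
        tn = sum(s < cut and l == 0 for s, l in zip(scores, labels))
        fp = sum(s >= cut and l == 0 for s, l in zip(scores, labels))
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1
        if best is None or j > best[0]:
            best = (j, cut, sens, spec)
    return best

# Synthetic screening scores (higher = higher risk) and outcomes
scores = [0, 1, 1, 2, 3, 4, 5, 6, 7, 9]
labels = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
j, cut, sens, spec = best_cutoff(scores, labels)
```

The study instead compares AUCs of box combinations; the scan above is the per-cutoff building block from which sensitivity/specificity trade-offs like 90.2%/67.5% are read off.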
Abstract:
Agricultural pests are responsible for millions of dollars in crop losses and management costs every year. In order to implement optimal site-specific treatments and reduce control costs, new methods to accurately monitor and assess pest damage need to be investigated. In this paper we explore the combination of unmanned aerial vehicles (UAV), remote sensing and machine learning techniques as a promising technology to address this challenge. The deployment of UAVs as a sensor platform is a rapidly growing field of study for biosecurity and precision agriculture applications. In this experiment, a data collection campaign is performed over a sorghum crop severely damaged by white grubs (Coleoptera: Scarabaeidae). The larvae of these scarab beetles feed on the roots of plants, which in turn impairs root exploration of the soil profile. In the field, crop health status could be classified according to three levels: bare soil where plants were decimated, transition zones of reduced plant density and healthy canopy areas. In this study, we describe the UAV platform deployed to collect high-resolution RGB imagery as well as the image processing pipeline implemented to create an orthoimage. An unsupervised machine learning approach is formulated in order to create a meaningful partition of the image into each of the crop levels. The aim of the approach is to simplify the image analysis step by minimizing user input requirements and avoiding the manual data labeling necessary in supervised learning approaches. The implemented algorithm is based on the K-means clustering algorithm. In order to control high-frequency components present in the feature space, a neighbourhood-oriented parameter is introduced by applying Gaussian convolution kernels prior to K-means. The outcome of this approach is a soft K-means algorithm similar to the EM algorithm for Gaussian mixture models. 
The results show that the algorithm delivers decision boundaries that consistently classify the field into three clusters, one for each crop-health level. The methodology presented in this paper represents an avenue for further research towards automated crop damage assessment and biosecurity surveillance.
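A much-simplified version of the clustering step can be sketched in one dimension: smooth the signal with a Gaussian kernel, then partition it into three crop-health levels with K-means. Note that this toy uses hard K-means on a synthetic intensity row, whereas the paper applies Gaussian convolution in feature space to obtain a soft K-means; all values below are illustrative:

```python
import math

def gaussian_blur_1d(row, sigma=1.0, radius=2):
    """Smooth a 1-D signal with a normalised Gaussian kernel
    (edges are clamped)."""
    kernel = [math.exp(-(i * i) / (2 * sigma * sigma))
              for i in range(-radius, radius + 1)]
    total = sum(kernel)
    kernel = [k / total for k in kernel]
    out = []
    for i in range(len(row)):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = min(max(i + j - radius, 0), len(row) - 1)
            acc += w * row[idx]
        out.append(acc)
    return out

def kmeans_1d(values, k=3, iters=25):
    """Plain K-means on scalar values with a deterministic,
    evenly spread initialisation."""
    lo, hi = min(values), max(values)
    centroids = [lo + (hi - lo) * i / (k - 1) for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            nearest = min(range(k), key=lambda c: abs(v - centroids[c]))
            clusters[nearest].append(v)
        centroids = [sum(c) / len(c) if c else centroids[j]
                     for j, c in enumerate(clusters)]
    return centroids

# Synthetic "crop row": bare soil (~0.1), transition (~0.5), canopy (~0.9)
row = [0.1] * 20 + [0.5] * 20 + [0.9] * 20
centroids = kmeans_1d(gaussian_blur_1d(row), k=3)
```

The smoothing plays the role of the paper's neighbourhood-oriented parameter: it suppresses high-frequency pixel noise so the three recovered cluster centres track the bare-soil, transition and healthy-canopy levels.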
Abstract:
This paper addresses the challenges of flood mapping using multispectral images. Quantitative flood mapping is critical for flood damage assessment and management. Remote sensing images obtained from various satellite or airborne sensors provide valuable data for this application, from which the extent of flooding can be extracted. However, the great challenge in the data interpretation is to achieve more reliable flood-extent mapping that includes both the fully inundated areas and the 'wet' areas where trees and houses are partly covered by water. This is a typical combined pure-pixel and mixed-pixel problem. In this paper, a recently developed extended Support Vector Machines method for spectral unmixing has been applied to generate an integrated map showing both pure pixels (fully inundated areas) and mixed pixels (trees and houses partly covered by water). The outputs were compared with the conventional mean-based linear spectral mixture model, and better performance was demonstrated on a subset of Landsat ETM+ data recorded over the Daly River Basin, NT, Australia, on 3 March 2008, after a flood event.
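The extended SVM unmixing method cannot be reconstructed from the abstract, but the conventional linear spectral mixture model it is compared against can be sketched for the two-endmember case. The band reflectances below are hypothetical:

```python
def unmix_fraction(pixel, end_a, end_b):
    """Two-endmember linear unmixing with a sum-to-one constraint:
    pixel ≈ f * end_a + (1 - f) * end_b.
    Closed-form least squares for the abundance f, clipped to [0, 1]."""
    d = [a - b for a, b in zip(end_a, end_b)]
    r = [p - b for p, b in zip(pixel, end_b)]
    f = sum(di * ri for di, ri in zip(d, r)) / sum(di * di for di in d)
    return max(0.0, min(1.0, f))  # keep the abundance physical

# Hypothetical per-band reflectances for two endmembers
water = [0.05, 0.03, 0.02, 0.01]
canopy = [0.04, 0.08, 0.30, 0.45]
# A "wet" mixed pixel: 30% water, 70% partly submerged canopy
mixed = [0.3 * w + 0.7 * c for w, c in zip(water, canopy)]
f_water = unmix_fraction(mixed, water, canopy)
```

A pure pixel would unmix to an abundance of 0 or 1; intermediate fractions like this one are exactly the 'wet' areas the paper aims to map alongside fully inundated ones.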
Abstract:
Computational modelling of mechanisms underlying processes in the real world can be of great value in understanding complex biological behaviours. Uptake in general biology and ecology has been rapid. However, such modelling often requires specific data sets that are overly costly in time and resources to collect. The aim of the current study was to test whether a generic behavioural ecology model constructed using published data could give realistic outputs for individual species. An individual-based model was developed using the Pattern-Oriented Modelling (POM) strategy and protocol, based on behavioural rules associated with insect movement choices. Frugivorous Tephritidae (fruit flies) were chosen because of their economic significance in global agriculture and the multiple published data sets available for a range of species. The Queensland fruit fly (Qfly), Bactrocera tryoni, was identified as a suitable individual species for testing. Plant canopies with modified architecture were used to run predictive simulations. A field study was then conducted to validate our model predictions on how plant architecture affects fruit fly behaviour. Characteristics of plant architecture, such as different shapes (e.g., closed-canopy and vase-shaped), affected fly movement patterns and time spent on host fruit. The number of visits to host fruit also differed between the edge and centre of closed-canopy plants. Compared to plant architecture, host fruit contributed less to flies’ movement patterns. The results from this model, combined with our field study and published empirical data, suggest that placing fly traps in the upper canopy at the edge should work best. Such a modelling approach allows rapid testing of ideas about organismal interactions with environmental substrates in silico rather than in vivo, generating new perspectives. Using published data provides a saving in time and resources. Adjustments for specific questions can be achieved by refining parameters based on targeted experiments.