27 results for Lab-On-a-Chip (LOC)
Abstract:
In this paper, a hardware-based path planning architecture for unmanned aerial vehicle (UAV) adaptation is proposed. The architecture aims to provide UAVs with higher autonomy using an application-specific evolutionary algorithm (EA) implemented entirely on a field programmable gate array (FPGA) chip. The physical attributes of an FPGA chip, compact in size and low in power consumption, make it an ideal platform for UAV applications. The design, implemented entirely in hardware, consists of EA modules, population storage resources, and the three-dimensional terrain information necessary for the path planning process, subject to constraints accounted for separately via UAV, environment and mission profiles. The architecture has been successfully synthesised for a target Xilinx Virtex-4 FPGA platform with 32% logic slice utilisation. Results obtained from case studies for a small UAV helicopter, with the environment derived from LIDAR (Light Detection and Ranging) data, verify the effectiveness of the proposed FPGA-based path planner and demonstrate convergence at rates above the typical 10 Hz update frequency of an autopilot system.
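Although the paper implements its planner in FPGA logic, the underlying evolutionary loop can be illustrated in software. Below is a minimal Python sketch of an EA that evolves fixed-length waypoint paths over a terrain grid; the grid size, altitude constraint, fitness weights, and GA parameters are illustrative assumptions, not values from the paper.

```python
import random

# Illustrative parameters (not from the paper): terrain resolution, altitude
# constraint, population size, and generation count are all assumptions.
GRID = 16
TERRAIN = [[random.uniform(0.0, 100.0) for _ in range(GRID)] for _ in range(GRID)]
START, GOAL = (0, 0), (GRID - 1, GRID - 1)
SAFE_ALT = 60.0          # waypoints over terrain higher than this are penalised
N_WAYPOINTS, POP, GENS = 6, 40, 200

def random_path():
    """A candidate solution: a fixed-length list of intermediate waypoints."""
    return [(random.randrange(GRID), random.randrange(GRID)) for _ in range(N_WAYPOINTS)]

def fitness(path):
    """Lower is better: path length plus a heavy penalty for terrain violations."""
    pts = [START] + path + [GOAL]
    length = sum(((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
                 for a, b in zip(pts, pts[1:]))
    violations = sum(1 for x, y in path if TERRAIN[x][y] > SAFE_ALT)
    return length + 1000.0 * violations

def evolve():
    pop = [random_path() for _ in range(POP)]
    for _ in range(GENS):
        pop.sort(key=fitness)
        survivors = pop[:POP // 2]                  # truncation selection
        children = []
        while len(children) < POP - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N_WAYPOINTS)  # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.2:               # mutation: relocate a waypoint
                i = random.randrange(N_WAYPOINTS)
                child[i] = (random.randrange(GRID), random.randrange(GRID))
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = evolve()
print("best path:", best, "fitness:", round(fitness(best), 1))
```

A hardware version of this loop pipelines the fitness evaluation and keeps the population in on-chip block RAM, which is what allows convergence faster than the autopilot's update rate.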
Abstract:
The Australian e-Health Research Centre (AEHRC) recently participated in the ShARe/CLEF eHealth Evaluation Lab Task 1. The goal of this task is to identify mentions of disorders in free-text electronic health records and map those disorders to SNOMED CT concepts in the UMLS metathesaurus. This paper details our participation in this ShARe/CLEF task. Our approaches use the clinical natural language processing tool MetaMap and Conditional Random Fields (CRFs) to identify mentions of disorders and then map them to SNOMED CT concepts. Empirical results obtained on the 2013 ShARe/CLEF task highlight that our instance of MetaMap (after filtering irrelevant semantic types), although achieving a high level of precision, is only able to identify a small proportion of disorders (about 21% to 28%) in free-text health records. On the other hand, the addition of the CRF models allows a much higher recall (57% to 79%) of disorders from free text, without appreciable detriment to precision. When evaluating the accuracy of the mapping of disorders to SNOMED CT concepts in the UMLS, we observe that the mapping obtained by our filtered instance of MetaMap delivers state-of-the-art effectiveness if only spans identified by our system are considered ('relaxed' accuracy).
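For readers unfamiliar with the CRF side of such a pipeline, here is a minimal Python sketch of a BIO disorder tagger using the sklearn-crfsuite library; the toy sentences, labels, and feature set are illustrative assumptions and do not reproduce the AEHRC system.

```python
# Requires: pip install sklearn-crfsuite
import sklearn_crfsuite

def word2features(sent, i):
    """Simple lexical features per token; a real system would also add
    MetaMap-derived features such as candidate semantic types."""
    w = sent[i]
    return {
        "lower": w.lower(),
        "is_title": w.istitle(),
        "suffix3": w[-3:],
        "prev": sent[i - 1].lower() if i > 0 else "<BOS>",
        "next": sent[i + 1].lower() if i < len(sent) - 1 else "<EOS>",
    }

# Hypothetical toy sentences with BIO labels for disorder mentions.
sents = [["Patient", "denies", "chest", "pain", "."],
         ["History", "of", "atrial", "fibrillation", "."]]
labels = [["O", "O", "B-Disorder", "I-Disorder", "O"],
          ["O", "O", "B-Disorder", "I-Disorder", "O"]]

X = [[word2features(s, i) for i in range(len(s))] for s in sents]
crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1, max_iterations=50)
crf.fit(X, labels)
print(crf.predict(X))   # per-token B/I/O disorder tags
```

In a combined pipeline, MetaMap output would typically be injected as additional token features, which is what lets the CRF recover mentions that a dictionary-driven tool alone misses.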
Abstract:
Discharge summaries and other free-text reports in healthcare transfer information between working shifts and geographic locations. Patients are likely to have difficulty understanding their content because of their medical jargon, non-standard abbreviations, and ward-specific idioms. This paper reports on an evaluation lab whose aim is to support the continuum of care by developing methods and resources that make clinical reports in English easier for patients to understand and that help them find information related to their condition.
Abstract:
This paper presents the results of Task 3 of the ShARe/CLEF eHealth Evaluation Lab 2013. This evaluation lab focuses on improving access to medical information on the web. The task objective was to investigate the effect of using additional information, such as discharge summaries, and external resources, such as medical ontologies, on IR effectiveness. Participants were allowed to submit up to seven runs: one mandatory run using no additional information or external resources, and up to three runs each with and without the discharge summaries.
Abstract:
RATIONALE: Polymer-based surface coatings in outdoor applications experience accelerated degradation due to exposure to solar radiation, oxygen and atmospheric pollutants. These deleterious agents cause undesirable changes to the aesthetic and mechanical properties of the polymer, reducing its lifetime. The use of antioxidants such as hindered amine light stabilisers (HALS) retards these degradative processes; however, the mechanisms of HALS action and polymer degradation are poorly understood. METHODS: Detection of the HALS TINUVIN® 123 (bis(1-octyloxy-2,2,6,6-tetramethyl-4-piperidyl) sebacate) and of polymer degradation products directly from a polyester-based coil coating was achieved by liquid extraction surface analysis (LESA) coupled to a triple quadrupole QTRAP® 5500 mass spectrometer. The detection of TINUVIN® 123 and melamine was confirmed by the characteristic fragmentation patterns observed in LESA-MS/MS spectra, which were identical to those reported for authentic samples. RESULTS: Analysis of an unstabilised coil coating by LESA-MS after 4 years of outdoor field testing revealed the presence of melamine (1,3,5-triazine-2,4,6-triamine) as a polymer degradation product at elevated levels. Changes to the physical appearance of the coil coating, including powder-like deposits on the coating's surface, were observed to coincide with melamine deposits and are indicative of the phenomenon known as polymer 'blooming'. CONCLUSIONS: For the first time, in situ detection of analytes from a thermoset polymer coating was accomplished without any sample preparation, providing advantages over traditional extraction-analysis approaches and some contemporary ambient MS methods. Detection of HALS and polymer degradation products such as melamine provides insight into the mechanisms by which degradation occurs and suggests LESA-MS is a powerful new tool for polymer analysis. Copyright © 2012 John Wiley & Sons, Ltd.
Abstract:
This paper reports on the 2nd ShARe/CLEF eHealth evaluation lab, which continues our evaluation resource building activities for the medical domain. In this lab we focus on patients' information needs, as opposed to the more common campaign focus on the specialised information needs of physicians and other healthcare workers. The usage scenario of the lab is to ease patients' and next-of-kins' understanding of eHealth information, in particular clinical reports. The 1st ShARe/CLEF eHealth evaluation lab was held in 2013 and consisted of three tasks. Task 1 focused on named entity recognition and normalization of disorders; Task 2 on normalization of acronyms/abbreviations; and Task 3 on information retrieval to address questions patients may have when reading clinical reports. This year's lab introduces a new challenge in Task 1 on visual-interactive search and exploration of eHealth data. Its aim is to help patients (or their next-of-kin) with readability issues related to their hospital discharge documents and the related information search on the Internet. Task 2 continues the information extraction work of the 2013 lab, specifically focusing on disorder attribute identification and normalization from clinical text. Finally, this year's Task 3 further extends the 2013 information retrieval task by cleaning the 2013 document collection and introducing a new query generation method and multilingual queries. The de-identified clinical reports used by the three tasks were from US intensive care and originated from the MIMIC II database. Other text documents for Tasks 1 and 3 were from the Internet and originated from the Khresmoi project. Task 2 annotations originated from the ShARe annotations. For Tasks 1 and 3, new annotations, queries, and relevance assessments were created. 50, 79, and 91 people registered their interest in Tasks 1, 2, and 3, respectively. 24 unique teams participated, with 1, 10, and 14 teams in Tasks 1, 2 and 3, respectively. The teams were from Africa, Asia, Canada, Europe, and North America. The Task 1 submission, reviewed by 5 expert peers, related to the task evaluation category of Effective use of interaction and targeted the needs of both expert and novice users. The best system had an Accuracy of 0.868 in Task 2a, an F1-score of 0.576 in Task 2b, and a Precision at 10 (P@10) of 0.756 in Task 3. The results demonstrate the substantial community interest in, and the capabilities of, these systems in making clinical reports easier for patients to understand. The organisers have made the data and tools available for future research and development.
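For orientation, the three headline metrics quoted above can be computed as follows; this is a generic Python sketch on hypothetical toy data, not the lab's official evaluation scripts.

```python
# Toy illustrations of the three headline metrics; all inputs are hypothetical.

def accuracy(gold, pred):
    """Fraction of items mapped to the correct concept (Task 2a style)."""
    return sum(g == p for g, p in zip(gold, pred)) / len(gold)

def f1(gold, pred):
    """Harmonic mean of precision and recall over extracted spans (Task 2b style)."""
    tp = len(gold & pred)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

def p_at_10(ranked, relevant):
    """Precision at rank 10 for a retrieval run (Task 3 style)."""
    return sum(1 for doc in ranked[:10] if doc in relevant) / 10

print(accuracy(["C1", "C2", "C3"], ["C1", "C2", "C9"]))            # ~0.667
print(f1({"span1", "span2"}, {"span1", "span3"}))                   # 0.5
print(p_at_10(list("abcdefghij"), {"a", "c", "e", "g", "h", "j"}))  # 0.6
```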
Abstract:
The Urban Informatics Research Lab brings together a group of people who focus their research on interdisciplinary topics at the intersection of social, spatial, and technical research domains—that is, people, place, and technology. Those topics are spread across the breadth of urban life—its contemporary issues and its needs, as well as the design opportunities that we have as individuals, groups, communities, and as a whole society. The lab’s current research areas include urban planning and design, civic innovation, mobility and transportation, education and connected learning, environmental sustainability, and food and urban agriculture. The common denominator of the lab’s approach is user-centered design research directed toward understanding, conceptualizing, developing, and evaluating sociotechnical practices as well as the opportunities afforded by innovative digital technology in urban environments.
Abstract:
Current developments in gene medicine and vaccination studies are utilizing plasmid DNA (pDNA) as the vector. For this reason, there has been an increasing trend towards larger and larger doses of pDNA in human trials: from 100-1000 μg in 2002 to 500-5000 μg in 2005. The increasing demand for pDNA has created the need to scale up current production under optimum economy. In this work, different standard media (LB, TB and SOC) for culturing recombinant Escherichia coli DH5α harbouring pUC19 were compared with a medium optimised for pDNA production. Lab-scale fermentations using the standard media showed that the highest pDNA volumetric and specific yields were for TB (11.4 μg/ml and 6.3 μg/mg dry cell mass, respectively) and the lowest were for LB (2.8 μg/ml and 3.3 μg/mg dry cell mass, respectively). A fourth medium, PDMR, designed by modifying a stoichiometrically formulated medium with an optimised carbon source concentration and carbon-to-nitrogen ratio, displayed pDNA volumetric and specific yields of 23.8 μg/ml and 11.2 μg/mg dry cell mass, respectively. However, it is the economic advantages of the optimised medium that make it so attractive. Keeping all variables constant except the medium, and using LB as a base scenario (100 medium cost [MC] units/mg pDNA), the optimised PDMR medium yielded pDNA at a cost of only 27 MC units/mg pDNA. These results show that greater amounts of pDNA can be obtained more economically, with minimal extra effort, simply by using a medium optimised for pDNA production.
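The relative cost figure follows from a short calculation. In the Python sketch below, the volumetric yields are the reported values, while the per-litre medium prices are hypothetical placeholders chosen for illustration (the abstract reports only relative MC units, normalised to LB = 100).

```python
# Rough reproduction of the abstract's relative cost comparison. Volumetric
# yields are the reported values; the per-litre medium prices are hypothetical
# placeholders (the abstract gives only relative "medium cost (MC) units").

yields_ug_per_ml = {"LB": 2.8, "TB": 11.4, "PDMR": 23.8}   # reported yields
price_per_litre = {"LB": 1.0, "TB": 2.2, "PDMR": 2.3}      # assumed prices

def mc_units_per_mg(medium: str, base: str = "LB") -> float:
    """Cost per mg pDNA, normalised so the base medium (LB) is 100 MC units."""
    def cost(m: str) -> float:
        # 1 ug/ml equals 1 mg/L, so the yield doubles as mg pDNA per litre.
        return price_per_litre[m] / yields_ug_per_ml[m]
    return 100 * cost(medium) / cost(base)

for medium in yields_ug_per_ml:
    print(f"{medium}: {mc_units_per_mg(medium):.0f} MC units/mg pDNA")
```

With these illustrative prices, PDMR comes out at roughly the reported 27 MC units/mg pDNA, showing how the higher volumetric yield dominates the cost per milligram even when the medium itself is more expensive.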
Abstract:
Multidrug resistance (MDR) occurs in prostate cancer when the cancer cells resist chemotherapeutic drugs by pumping them out of the cells. MDR inhibitors such as cyclosporin A (CsA) can stop this pumping and enhance the accumulation of drugs in the cells. Cellular drug accumulation is monitored using a microfluidic chip mounted on a single-cell bioanalyzer. This equipment has been developed to measure the accumulation of drugs such as doxorubicin (DOX) and fluorescently labeled paclitaxel (PTX) in single prostate cancer cells. The inhibition of drug efflux was examined on the same prostate cell in both drug-sensitive and drug-resistant cells. Accumulation of these drug molecules was not found in the MDR cells (PC-3 RX-DT2R cells). Enhanced drug accumulation was observed only after treating the MDR cells with 5 μM CsA as the MDR inhibitor. We envision that this monitoring of the accumulation of fluorescent molecules (drugs or fluorescent labels), if conducted on single patient cancer cells, can provide information for the clinical monitoring of patients undergoing chemotherapy in the future.
Abstract:
In his 1987 book, The Media Lab: Inventing the Future at MIT, Stewart Brand provides an insight into the visions of the future of the media in the 1970s and 1980s. He notes that Nicholas Negroponte made a compelling case for the foundation of a media laboratory at MIT with diagrams detailing the convergence of three sectors of the media: the broadcast and motion picture industry; the print and publishing industry; and the computer industry. Brand commented: ‘If Negroponte was right and communications technologies really are converging, you would look for signs that technological homogenisation was dissolving old boundaries out of existence, and you would expect an explosion of new media where those boundaries used to be’. Two decades later, technology developers, media analysts and lawyers have become excited about the latest phase of media convergence. In 2006, the faddish Time Magazine heralded the arrival of various Web 2.0 social networking services: ‘You can learn more about how Americans live just by looking at the backgrounds of YouTube videos—those rumpled bedrooms and toy-strewn basement rec rooms—than you could from 1,000 hours of network television. And we didn’t just watch, we also worked. Like crazy. We made Facebook profiles and Second Life avatars and reviewed books at Amazon and recorded podcasts. We blogged about our candidates losing and wrote songs about getting dumped. We camcordered bombing runs and built open-source software. America loves its solitary geniuses—its Einsteins, its Edisons, its Jobses—but those lonely dreamers may have to learn to play with others. Car companies are running open design contests. Reuters is carrying blog postings alongside its regular news feed. Microsoft is working overtime to fend off user-created Linux. We’re looking at an explosion of productivity and innovation, and it’s just getting started, as millions of minds that would otherwise have drowned in obscurity get backhauled into the global intellectual economy.’ The magazine announced that Time’s Person of the Year was ‘You’, the everyman and everywoman consumer, ‘for seizing the reins of the global media, for founding and framing the new digital democracy, for working for nothing and beating the pros at their own game’. This review essay considers three recent books that have explored the legal dimensions of new media. In contrast to the unbridled exuberance of Time Magazine, this series of legal works displays an anxious trepidation about the legal ramifications associated with the rise of social networking services. In his tour de force, The Future of Reputation: Gossip, Rumor, and Privacy on the Internet, Daniel Solove considers the implications of social networking services, such as Facebook and YouTube, for the legal protection of reputation under privacy law and defamation law. Andrew Kenyon’s edited collection, TV Futures: Digital Television Policy in Australia, explores the intersection between media law and copyright law in the regulation of digital television and Internet videos. In The Future of the Internet and How to Stop It, Jonathan Zittrain explores the impact of ‘generative’ technologies and ‘tethered applications’, considering everything from the Apple Mac and the iPhone to the One Laptop per Child programme.
Abstract:
Circulating tumor cells (CTCs) are found in the blood of patients with cancer. Although rare, these cells can provide useful information for chemotherapy. However, isolating them from blood is technically challenging because of their small numbers. An integrated microfluidic chip, dubbed the CTC chip, was designed and fabricated for tumor cell isolation. As CTCs usually show multidrug resistance (MDR), the effect of MDR inhibitors on chemotherapeutic drug accumulation in the isolated single tumor cells was measured. As a model of CTC isolation, human prostate tumor cells were mixed with mouse blood cells, and label-free isolation of the tumor cells was conducted based on the difference in cell size. The major advantages of the CTC chip are its ability to perform fast cell isolation followed by multiple rounds of single-cell measurements, suggesting a potential assay for detecting drug responses based on liquid biopsies of cancer patients.
Abstract:
Digital and interactive technologies are becoming increasingly embedded in the everyday lives of people around the world. The application of technologies such as real-time, context-aware, and interactive technologies; augmented and immersive realities; social media; and location-based services has been particularly evident in urban environments, where technological and sociocultural infrastructures enable easier deployment and adoption than in non-urban areas. There has been growing consumer demand for new forms of experiences and services enabled by these emerging technologies. We call this ambient media, as the media is embedded in the natural human living environment. This workshop focuses on ambient media services, applications, and technologies that promote people’s engagement in creating and recreating liveliness in urban environments, particularly through arts, culture, and gastronomic experiences. The RelCi workshop series is organized in cooperation with the Queensland University of Technology (QUT), in particular the Urban Informatics Lab, and the Tampere University of Technology (TUT), in particular the Entertainment and Media Management (EMMi) Lab. The workshop runs under the umbrella of the International Ambient Media Association (AMEA) (http://www.ambientmediaassociation.org), which hosts the international open access journal “International Journal on Information Systems and Management in Creative eMedia” and the international open access series “International Series on Information Systems and Management in Creative eMedia” (see http://www.tut.fi/emmi/Journal). The RelCi workshop took place for the first time in 2012 in conjunction with ICME 2012 in Melbourne, Australia; this year’s edition took place in conjunction with INTERACT 2013 in Cape Town, South Africa. In addition, the International Ambient Media Association (AMEA) organizes the Semantic Ambient Media (SAME) workshop series, which took place in 2008 in conjunction with ACM Multimedia 2008 in Vancouver, Canada; in 2009 in conjunction with AmI 2009 in Salzburg, Austria; in 2010 in conjunction with AmI 2010 in Malaga, Spain; in 2011 in conjunction with Communities and Technologies 2011 in Brisbane, Australia; in 2012 in conjunction with Pervasive 2012 in Newcastle, UK; and in 2013 in conjunction with C&T 2013 in Munich, Germany.