975 results for IoT platforms
Abstract:
BACKGROUND: The serum peptidome may be a valuable source of diagnostic cancer biomarkers. Previous mass spectrometry (MS) studies have suggested that groups of related peptides discriminatory for different cancer types are generated ex vivo from abundant serum proteins by tumor-specific exopeptidases. We tested 2 complementary serum profiling strategies to see if similar peptides could be found that discriminate ovarian cancer from benign cases and healthy controls. METHODS: We subjected identically collected and processed serum samples from healthy volunteers and patients to automated polypeptide extraction on octadecylsilane-coated magnetic beads and separately on ZipTips before MALDI-TOF MS profiling at 2 centers. The 2 platforms were compared and case control profiling data analyzed to find altered MS peak intensities. We tested models built from training datasets for both methods for their ability to classify a blinded test set. RESULTS: Both profiling platforms had CVs of approximately 15% and could be applied for high-throughput analysis of clinical samples. The 2 methods generated overlapping peptide profiles, with some differences in peak intensity in different mass regions. In cross-validation, models from training data gave diagnostic accuracies up to 87% for discriminating malignant ovarian cancer from healthy controls and up to 81% for discriminating malignant from benign samples. Diagnostic accuracies up to 71% (malignant vs healthy) and up to 65% (malignant vs benign) were obtained when the models were validated on the blinded test set. CONCLUSIONS: For ovarian cancer, altered MALDI-TOF MS peptide profiles alone cannot be used for accurate diagnoses.
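The ~15% figure quoted above is a coefficient of variation (CV), i.e. the standard deviation of replicate measurements expressed as a percentage of their mean. The sketch below shows the standard calculation; the replicate peak intensities are invented for illustration and are not data from the study.

```java
// Illustrative coefficient-of-variation (CV) calculation for replicate
// MS peak intensities; the intensity values below are made up.
public class PeakCV {
    // CV = (sample standard deviation / mean) * 100%
    static double cv(double[] x) {
        double mean = 0;
        for (double v : x) mean += v;
        mean /= x.length;
        double ss = 0;
        for (double v : x) ss += (v - mean) * (v - mean);
        double sd = Math.sqrt(ss / (x.length - 1)); // sample SD (n - 1)
        return 100.0 * sd / mean;
    }

    public static void main(String[] args) {
        double[] replicates = {100.0, 115.0, 88.0, 102.0, 95.0};
        System.out.printf("CV = %.1f%%%n", cv(replicates));
    }
}
```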
Abstract:
In the Biodiversity World (BDW) project we have created a flexible and extensible Web Services-based Grid environment for biodiversity researchers to solve problems in biodiversity and analyse biodiversity patterns. In this environment, heterogeneous and globally distributed biodiversity-related resources such as data sets and analytical tools are made available to be accessed and assembled by users into workflows to perform complex scientific experiments. One such experiment is bioclimatic modelling of the geographical distribution of individual species using climate variables in order to predict past and future climate-related changes in species distribution. Data sources and analytical tools required for such analysis of species distribution are widely dispersed, available on heterogeneous platforms, present data in different formats and lack interoperability. The BDW system brings all these disparate units together so that the user can combine tools with little thought as to their availability, data formats and interoperability. The current Web Services-based Grid environment enables execution of the BDW workflow tasks in remote nodes but with a limited scope. The next step in the evolution of the BDW architecture is to enable workflow tasks to utilise computational resources available within and outside the BDW domain. We describe the present BDW architecture and its transition to a new framework which provides a distributed computational environment for mapping and executing workflows in addition to bringing together heterogeneous resources and analytical tools.
Abstract:
The Java language first came to public attention in 1995. Within a year, it was being speculated that Java might be a good language for parallel and distributed computing. Its core features, including being object oriented and platform independent, as well as having built-in network support and threads, have encouraged this view. Today, Java is being used in almost every type of computer-based system, ranging from sensor networks to high performance computing platforms, and from enterprise applications through to complex research-based simulations. In this paper the key features that make Java a good language for parallel and distributed computing are first discussed. Two Java-based middleware systems, namely MPJ Express, an MPI-like Java messaging system, and Tycho, a wide-area asynchronous messaging framework with an integrated virtual registry, are then discussed. The paper concludes by highlighting the advantages of using Java as middleware to support distributed applications.
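The built-in thread and synchronisation support cited above can be illustrated with a minimal producer/consumer message exchange using only the standard library. This is a generic sketch of Java's concurrency primitives, not the MPJ Express or Tycho API.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Minimal message passing between two Java threads using a BlockingQueue,
// illustrating the built-in threading support that makes Java attractive
// for parallel and distributed computing.
public class ThreadMessaging {
    static String exchange() throws InterruptedException {
        BlockingQueue<String> channel = new ArrayBlockingQueue<>(1);
        Thread sender = new Thread(() -> {
            try {
                channel.put("hello from sender"); // blocks if channel is full
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        sender.start();
        String msg = channel.take(); // blocks until a message arrives
        sender.join();
        return msg;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(exchange());
    }
}
```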
Abstract:
Traditionally, applications and tools supporting collaborative computing have been designed only with personal computers in mind and support a limited range of computing and network platforms. These applications are therefore not well equipped to deal with network heterogeneity and, in particular, do not cope well with dynamic network topologies. Progress in this area must be made if we are to fulfil the needs of users and support the diversity, mobility, and portability that are likely to characterise group work in the future. This paper describes a groupware platform called Coco that is designed to support collaboration in a heterogeneous network environment. The work demonstrates that progress in the development of generic supporting groupware is achievable, even in the context of heterogeneous and dynamic networks. It also demonstrates the progress made in the development of an underlying communications infrastructure, building on peer-to-peer concepts and topologies to improve scalability and robustness.
Abstract:
Increasingly, distributed systems are being used to host all manner of applications. While these platforms provide a relatively cheap and effective means of executing applications, so far there has been little work in developing tools and utilities that can help application developers understand problems with the supporting software or the executing applications. To fully understand why an application executing on a distributed system is not behaving as expected, it is important that not only the application but also the underlying middleware and the operating system are analysed; otherwise issues could be missed, and overall performance profiling and fault diagnosis would certainly be harder. We believe that one approach to profiling and analysing distributed systems and their associated applications is via the plethora of log files generated at runtime. In this paper we report on a system (Slogger) that utilises various emerging Semantic Web technologies to gather the heterogeneous log files generated by the various layers in a distributed system and unify them in a common data store. Once unified, the log data can be queried and visualised in order to highlight potential problems or issues that may be occurring in the supporting software or the application itself.
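The unification step described above amounts to parsing heterogeneous log lines into one common schema that can then be queried across layers. The sketch below illustrates that idea only: Slogger itself uses Semantic Web technologies (RDF stores), and the two log formats and field names here are invented for demonstration.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of normalising two hypothetical log formats into a common record,
// in the spirit of Slogger's unification step. The real system uses RDF;
// the formats and field names here are assumptions for illustration.
public class LogUnifier {
    record LogEntry(String layer, String level, String message) {}

    // Hypothetical middleware format: "LEVEL|message"
    static LogEntry fromMiddleware(String line) {
        String[] parts = line.split("\\|", 2);
        return new LogEntry("middleware", parts[0], parts[1]);
    }

    // Hypothetical OS format: "level: message"
    static LogEntry fromOs(String line) {
        String[] parts = line.split(": ", 2);
        return new LogEntry("os", parts[0].toUpperCase(), parts[1]);
    }

    // Once in one schema, entries from all layers can be queried together.
    static List<LogEntry> errorsOnly(List<LogEntry> store) {
        List<LogEntry> out = new ArrayList<>();
        for (LogEntry e : store)
            if (e.level().equals("ERROR")) out.add(e);
        return out;
    }

    public static void main(String[] args) {
        List<LogEntry> store = List.of(
            fromMiddleware("ERROR|broker unreachable"),
            fromOs("warn: disk nearly full"));
        System.out.println(errorsOnly(store));
    }
}
```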
Abstract:
Technology-enhanced or Computer Aided Learning (e-learning) can be institutionally integrated and supported by learning management systems or Virtual Learning Environments (VLEs) to offer efficiency gains, effectiveness and scalability of the e-learning paradigm. However, this can only be achieved through the integration of pedagogically intelligent approaches, lesson preparation tools and a VLE that is well accepted by both students and teachers. This paper critically explores some of the issues relevant to scalable routinisation of e-learning at the tertiary level, typically first year university undergraduates, taking the teaching of Relational Data Analysis (RDA), as supported by multimedia authoring, as a case study. The paper concludes that blended learning approaches, which balance the deployment of e-learning with other modalities of learning delivery such as instructor-mediated group learning, offer the most flexible and scalable route to e-learning. This, however, requires the graceful integration of platforms for multimedia production, distribution and delivery through advanced interactive spaces that provoke learner engagement, promote learning autonomy and group learning facilitated by a cooperative-creative learning environment, and remain open to personal exploration of constructivist-constructionist pathways to learning.
Abstract:
In the decade since OceanObs `99, great advances have been made in the field of ocean data dissemination. The use of Internet technologies has transformed the landscape: users can now find, evaluate and access data rapidly and securely using only a web browser. This paper describes the current state of the art in dissemination methods for ocean data, focussing particularly on ocean observations from in situ and remote sensing platforms. We discuss current efforts being made to improve the consistency of delivered data and to increase the potential for automated integration of diverse datasets. An important recent development is the adoption of open standards from the Geographic Information Systems community; we discuss the current impact of these new technologies and their future potential. We conclude that new approaches will indeed be necessary to exchange data more effectively and forge links between communities, but these approaches must be evaluated critically through practical tests, and existing ocean data exchange technologies must be used to their best advantage. Investment in key technology components, cross-community pilot projects and the enhancement of end-user software tools will be required in order to assess and demonstrate the value of any new technology.
Abstract:
With the transition to multicore processors almost complete, the parallel processing community is seeking efficient ways to port legacy message passing applications to shared memory and multicore processors. MPJ Express is our reference implementation of Message Passing Interface (MPI)-like bindings for the Java language. Starting with the current release, the MPJ Express software can be configured in two modes: the multicore mode and the cluster mode. In the multicore mode, parallel Java applications execute on shared memory or multicore processors. In the cluster mode, Java applications parallelized using MPJ Express can be executed on distributed memory platforms like compute clusters and clouds. The multicore device has been implemented using Java threads in order to satisfy the two main design goals of portability and performance. We also discuss the challenges of integrating the multicore device in the MPJ Express software. This turned out to be a challenging task because the parallel application executes in a single JVM in the multicore mode, whereas in the cluster mode the parallel user application executes in multiple JVMs. Due to these inherent architectural differences between the two modes, the MPJ Express runtime is modified to ensure correct semantics of the parallel program. Towards the end, we compare the performance of MPJ Express (multicore mode) with other C and Java message passing libraries---including mpiJava, MPJ/Ibis, MPICH2, and MPJ Express (cluster mode)---on shared memory and multicore processors. We found that MPJ Express performs significantly better in the multicore mode than in the cluster mode. It also performs better than other Java messaging libraries, including mpiJava and MPJ/Ibis, when used in the multicore mode on shared memory or multicore processors.
We also demonstrate the effectiveness of the MPJ Express multicore device in Gadget-2, a massively parallel astrophysics N-body simulation code.
Abstract:
The NERC UK SOLAS-funded Reactive Halogens in the Marine Boundary Layer (RHaMBLe) programme comprised three field experiments. This manuscript presents an overview of the measurements made within the two simultaneous remote experiments conducted in the tropical North Atlantic in May and June 2007. Measurements were made from two mobile platforms and one ground-based platform. The heavily instrumented cruise D319 on the RRS Discovery from Lisbon, Portugal to São Vicente, Cape Verde and back to Falmouth, UK was used to characterise the spatial distribution of boundary layer components likely to play a role in reactive halogen chemistry. Measurements onboard the ARSF Dornier aircraft were used to allow the observations to be interpreted in the context of their vertical distribution and to confirm the interpretation of atmospheric structure in the vicinity of the Cape Verde islands. Long-term ground-based measurements at the Cape Verde Atmospheric Observatory (CVAO) on São Vicente were supplemented by long-term measurements of reactive halogen species and characterisation of additional trace gas and aerosol species during the intensive experimental period. This paper presents a summary of the measurements made within the RHaMBLe remote experiments and discusses them in their meteorological and chemical context as determined from these three platforms and from additional meteorological analyses. Air always arrived at the CVAO from the North East with a range of air mass origins (European, Atlantic and North American continental). Trace gases were present at stable and fairly low concentrations with the exception of a slight increase in some anthropogenic components in air of North American origin, though NOx mixing ratios during this period remained below 20 pptv (note the non-IUPAC adoption in this manuscript of pptv and ppbv, equivalent to pmol mol−1 and nmol mol−1, to reflect common practice).
Consistency with these air mass classifications is observed in the time series of soluble gas and aerosol composition measurements, with additional identification of periods of slightly elevated dust concentrations consistent with the trajectories passing over the African continent. The CVAO is shown to be broadly representative of the wider North Atlantic marine boundary layer; measurements of NO, O3 and black carbon from the ship are consistent with a clean Northern Hemisphere marine background. Aerosol composition measurements do not indicate elevated organic material associated with clean marine air. Closer to the African coast, black carbon and NO levels start to increase, indicating greater anthropogenic influence. Lower ozone in this region is possibly associated with the increased levels of measured halocarbons, associated with the nutrient rich waters of the Mauritanian upwelling. Bromide and chloride deficits in coarse mode aerosol at both the CVAO and on D319 and the continuous abundance of inorganic gaseous halogen species at CVAO indicate significant reactive cycling of halogens. Aircraft measurements of O3 and CO show that surface measurements are representative of the entire boundary layer in the vicinity both in diurnal variability and absolute levels. Above the inversion layer similar diurnal behaviour in O3 and CO is observed at lower mixing ratios in the air that had originated from south of Cape Verde, possibly from within the ITCZ. ECMWF calculations on two days indicate very different boundary layer depths and aircraft flights over the ship replicate this, giving confidence in the calculated boundary layer depth.
Abstract:
This paper presents a review of the design and development of the Yorick series of active stereo camera platforms and their integration into real-time closed loop active vision systems, whose applications span surveillance, navigation of autonomously guided vehicles (AGVs), and inspection tasks for teleoperation, including immersive visual telepresence. The mechatronic approach adopted for the design of the first system, including head/eye platform, local controller, vision engine, gaze controller and system integration, proved to be very successful. The design team comprised researchers with experience in parallel computing, robot control, mechanical design and machine vision. The success of the project has generated sufficient interest to sanction a number of revisions of the original head design, including the design of a lightweight compact head for use on a robot arm, and the further development of a robot head to look specifically at increasing visual resolution for visual telepresence. The controller and vision processing engines have also been upgraded, to include the control of robot heads on mobile platforms and control of vergence through tracking of an operator's eye movement. This paper details the hardware development of the different active vision/telepresence systems.
Abstract:
Mounted on the sides of two widely separated spacecraft, the two Heliospheric Imager (HI) instruments onboard NASA’s STEREO mission view, for the first time, the space between the Sun and Earth. These instruments are wide-angle visible-light imagers that incorporate sufficient baffling to eliminate scattered light to the extent that the passage of solar coronal mass ejections (CMEs) through the heliosphere can be detected. Each HI instrument comprises two cameras, HI-1 and HI-2, which have 20° and 70° fields of view and are off-pointed from the Sun direction by 14.0° and 53.7°, respectively, with their optical axes aligned in the ecliptic plane. This arrangement provides coverage over solar elongation angles from 4.0° to 88.7° at the viewpoints of the two spacecraft, thereby allowing the observation of Earth-directed CMEs along the Sun – Earth line to the vicinity of the Earth and beyond. Given the two separated platforms, this also presents the first opportunity to view the structure and evolution of CMEs in three dimensions. The STEREO spacecraft were launched from Cape Canaveral Air Force Base in late October 2006, and the HI instruments have been performing scientific observations since early 2007. The design, development, manufacture, and calibration of these unique instruments are reviewed in this paper. Mission operations, including the initial commissioning phase and the science operations phase, are described. Data processing and analysis procedures are briefly discussed, and ground-test results and in-orbit observations are used to demonstrate that the performance of the instruments meets the original scientific requirements.
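The quoted elongation coverage follows directly from the stated fields of view and off-point angles: each camera observes from (off-point − FOV/2) to (off-point + FOV/2) along the ecliptic. A quick arithmetic check (plain geometry from the figures above, no instrument data):

```java
// Elongation coverage of the two HI cameras: each camera sees from
// (off-point - FOV/2) to (off-point + FOV/2) along the ecliptic plane.
public class HiCoverage {
    static double innerEdge(double offPoint, double fov) { return offPoint - fov / 2.0; }
    static double outerEdge(double offPoint, double fov) { return offPoint + fov / 2.0; }

    public static void main(String[] args) {
        // HI-1: 20 deg FOV, off-pointed 14.0 deg -> covers 4.0 to 24.0 deg
        System.out.printf("HI-1: %.1f to %.1f deg%n", innerEdge(14.0, 20.0), outerEdge(14.0, 20.0));
        // HI-2: 70 deg FOV, off-pointed 53.7 deg -> covers 18.7 to 88.7 deg
        System.out.printf("HI-2: %.1f to %.1f deg%n", innerEdge(53.7, 70.0), outerEdge(53.7, 70.0));
    }
}
```

The inner edge of HI-1 (4.0°) and the outer edge of HI-2 (88.7°) reproduce the paper's quoted 4.0° to 88.7° coverage, with the two fields overlapping between 18.7° and 24.0°.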
Abstract:
Twenty-first century challenges facing agriculture include climate change, threats to food security for a growing population and downward economic pressures on rural livelihoods. Addressing these challenges will require innovation in extension theory, policy and education, at a time when the dominance of the state in the provision of knowledge and information services to farmers and rural entrepreneurs continues to decline. This paper suggests that extension theory is catching up with and helping us to understand innovative extension practice, and therefore provides a platform for improving rural development policies and strategies. Innovation is now less likely to be spoken of as something to be passed on to farmers than as a continuing process of creativity and adaptation that can be nurtured and sustained. Innovation systems and innovation platforms are concepts that recognise the multiple factors that lead to farmers developing, adapting and applying new ideas, and the importance of linking all actors in the value chain to ensure producers can access appropriate information and advice for decision making at all stages in the production process. Concepts of social learning, group development and solidarity, social capital, collective action and empowerment all help to explain, and therefore to apply more effectively, group extension approaches in building confidence and sustaining innovation. A challenge facing educators is to ensure the curricula for aspiring extension professionals in our higher education institutions are regularly reviewed and keep up with current and future developments in theory, policy and practice.
Abstract:
International Perspective The development of GM technology continues to expand into increasing numbers of crops and conferred traits. Inevitably, the focus remains on the major field crops of soybean, maize, cotton, oilseed rape and potato with introduced genes conferring herbicide tolerance and/or pest resistance. Although there are comparatively few GM crops that have been commercialised to date, GM versions of 172 plant species have been grown in field trials in 31 countries. European Crops with Containment Issues Of the 20 main crops in the EU there are four for which GM varieties are commercially available (cotton, maize for animal feed and forage, and oilseed rape). Fourteen have GM varieties in field trials (bread wheat, barley, durum wheat, sunflower, oats, potatoes, sugar beet, grapes, alfalfa, olives, field peas, clover, apples, rice) and two have GM varieties still in development (rye, triticale). Many of these crops have hybridisation potential with wild and weedy relatives in the European flora (bread wheat, barley, oilseed rape, durum wheat, oats, sugar beet and grapes), or with escapes (sunflower); and all have the potential to cross-pollinate fields of non-GM crops. Several fodder crops, forestry trees, grasses and ornamentals have varieties in field trials and these too may hybridise with wild relatives in the European flora (alfalfa, clover, lupin, silver birch, sweet chestnut, Norway spruce, Scots pine, poplar, elm, Agrostis canina, A. stolonifera, Festuca arundinacea, Lolium perenne, L. multiflorum, statice and rose). All these crops will require containment strategies to be in place if it is deemed necessary to prevent transgene movement to wild relatives and non-GM crops. Current Containment Strategies A wide variety of GM containment strategies are currently under development, with a particular focus on crops expressing pharmaceutical products.
Physical containment in greenhouses and growth rooms is suitable for some crops (tomatoes, lettuce) and for research purposes. Aquatic bioreactors of some non-crop species (algae, moss, and duckweed) expressing pharmaceutical products have been adopted by some biotechnology companies. There are obvious limitations of the scale of physical containment strategies, addressed in part by the development of large underground facilities in the US and Canada. The additional resources required to grow plants underground incur high costs that in the long term may negate any advantage of GM for commercial production. Natural genetic containment has been adopted by some companies through the selection of either non-food/feed crops (algae, moss, duckweed) as bio-pharming platforms or organisms with no wild relatives present in the local flora (safflower in the Americas). The expression of pharmaceutical products in leafy crops (tobacco, alfalfa, lettuce, spinach) enables growth and harvesting prior to and in the absence of flowering. Transgenically controlled containment strategies range in their approach and degree of development. Plastid transformation is relatively well developed but is not suited to all traits or crops and does not offer complete containment. Male sterility is well developed across a range of plants but has limitations in its application for fruit/seed bearing crops. It has been adopted in some commercial lines of oilseed rape despite not preventing escape via seed. Conditional lethality can be used to prevent flowering or seed development following the application of a chemical inducer, but requires 100% induction of the trait and sufficient application of the inducer to all plants. Equally, inducible expression of the GM trait requires equally stringent application conditions. Such a method will contain the trait but will allow the escape of a non-functioning transgene.
Seed lethality (‘terminator’ technology) is the only strategy at present that prevents transgene movement via seed, but due to public opinion against the concept it has never been trialled in the field and is no longer under commercial development. Methods to control flowering and fruit development such as apomixis and cleistogamy will prevent crop-to-wild and wild-to-crop pollination, but in nature both of these strategies are complex and leaky. None of the genes controlling these traits have as yet been identified or characterised and therefore have not been transgenically introduced into crop species. Neither of these strategies will prevent transgene escape via seed and any feral apomicts that form are arguably more likely to become invasives. Transgene mitigation reduces the fitness of initial hybrids and so prevents stable introgression of transgenes into wild populations. However, it does not prevent initial formation of hybrids or spread to non-GM crops. Such strategies could be detrimental to wild populations and have not yet been demonstrated in the field. Similarly, auxotrophy prevents persistence of escapes and hybrids containing the transgene in an uncontrolled environment, but does not prevent transgene movement from the crop. Recoverable block of function, intein trans-splicing and transgene excision all use recombinases to modify the transgene in planta either to induce expression or to prevent it. All require optimal conditions and 100% accuracy to function and none have been tested under field conditions as yet. All will contain the GM trait but all will allow some non-native DNA to escape to wild populations or to non-GM crops. There are particular issues with GM trees and grasses as both are largely undomesticated, wind pollinated and perennial, thus providing many opportunities for hybridisation. Some species of both trees and grass are also capable of vegetative propagation without sexual reproduction. 
There are additional concerns regarding the weedy nature of many grass species and the long-term stability of GM traits across the life span of trees. Transgene stability and conferred sterility are difficult to trial in trees as most field trials are only conducted during the juvenile phase of tree growth. Bio-pharming of pharmaceutical and industrial compounds in plants Bio-pharming of pharmaceutical and industrial compounds in plants offers an attractive alternative to mammalian-based pharmaceutical and vaccine production. Several plant-based products are already on the market (Prodigene’s avidin, β-glucuronidase, trypsin generated in GM maize; Ventria’s lactoferrin generated in GM rice). Numerous products are in clinical trials (collagen, antibodies against tooth decay and non-Hodgkin’s lymphoma from tobacco; human gastric lipase, therapeutic enzymes, dietary supplements from maize; Hepatitis B and Norwalk virus vaccines from potato; rabies vaccines from spinach; dietary supplements from Arabidopsis). The initial production platforms for plant-based pharmaceuticals were selected from conventional crops, largely because an established knowledge base already existed. Tobacco and other leafy crops such as alfalfa, lettuce and spinach are widely used as leaves can be harvested and no flowering is required. Many of these crops can be grown in contained greenhouses. Potato is also widely used and can also be grown in contained conditions. The introduction of morphological markers may aid in the recognition and traceability of crops expressing pharmaceutical products. Plant cells or plant parts may be transformed and maintained in culture to produce recombinant products in a contained environment. Plant cells in suspension or in vitro, roots, root cells and guttation fluid from leaves may be engineered to secrete proteins that may be harvested in a continuous, non-destructive manner.
Most strategies in this category remain developmental and have not been commercially adopted at present. Transient expression produces GM compounds from non-GM plants via the utilisation of bacterial or viral vectors. These vectors introduce the trait into specific tissues of whole plants or plant parts, but do not insert them into the heritable genome. There are some limitations of scale and the field release of such crops will require the regulation of the vector. However, several companies have several transiently expressed products in clinical and pre-clinical trials from crops raised in physical containment.
Abstract:
This paper will present a conceptual framework for the examination of land redevelopment based on a complex systems/networks approach. As Alvin Toffler insightfully noted, modern scientific enquiry has become exceptionally good at splitting problems into pieces but has forgotten how to put the pieces back together. Twenty-five years after his remarks, governments and corporations faced with the requirements of sustainability are struggling to promote an ‘integrated’ or ‘holistic’ approach to tackling problems. Despite the talk, both practice and research provide few platforms that allow for ‘joined up’ thinking and action. With socio-economic phenomena, such as land redevelopment, promising prospects open up when we assume that their constituents can make up complex systems whose emergent properties are more than the sum of the parts and whose behaviour is inherently difficult to predict. A review of previous research shows that it has mainly focused on idealised, ‘mechanical’ views of property development processes that fail to recognise in full the relationships between actors, the structures created and their emergent qualities. When reality failed to live up to the expectations of these theoretical constructs then somebody had to be blamed for it: planners, developers, politicians. However, from a ‘synthetic’ point of view the agents and networks involved in property development can be seen as constituents of structures that perform complex processes. These structures interact, forming new more complex structures and networks. Redevelopment then can be conceptualised as a process of transformation: a complex system, a ‘dissipative’ structure involving developers, planners, landowners, state agencies etc., unlocks the potential of previously used sites, transforms space towards a higher order of complexity and ‘consumes’ but also ‘creates’ different forms of capital in the process. 
Analysis of network relations points toward the ‘dualism’ of structure and agency in these processes of system transformation and change. Insights from actor network theory can be conjoined with notions of complexity and chaos to build an understanding of the ways in which actors actively seek to shape these structures and systems, whilst at the same time being recursively shaped by them in their strategies and actions. This approach transcends the blame game and allows for inter-disciplinary inputs to be placed within a broader explanatory framework that does away with many past dichotomies. Better understanding of the interactions between actors and the emergent qualities of the networks they form can improve our comprehension of the complex socio-spatial phenomena that redevelopment comprises. The insights that this framework provides when applied to UK institutional investment into redevelopment are considered to be significant.