788 results for Complementary computing
Abstract:
Key message The potential for exploiting heterosis for sorghum hybrid production in Ethiopia with improved local adaptation and farmers' preferences has been investigated, and populations suitable for initial hybrid development have been identified. Abstract Hybrids in sorghum have demonstrated increased productivity and stability of performance in the developed world. In Ethiopia, the uptake of hybrid sorghum has been limited to date, primarily due to poor adaptation and the absence of farmers' preferred traits in existing hybrids. This study aimed to identify complementary parental pools for developing locally adapted hybrids, through an analysis of whole-genome variability of 184 locally adapted genotypes and introduced hybrid parents (R and B). Genetic variability was assessed using genetic distance, model-based STRUCTURE analysis and pair-wise comparison of groups. We observed a high degree of genetic similarity of the Ethiopian improved inbred genotypes and a subset of landraces adapted to lowland agro-ecology with the introduced R lines. This, coupled with the genetic differentiation from existing B lines, indicated that these locally adapted genotype groups are expected to show patterns of heterotic expression similar to those observed between the introduced R and B line pools. Additionally, hybrids derived from these locally adapted genotypes will have the benefit of containing farmers' preferred traits. The groups most divergent from the introduced B lines were the Ethiopian landraces adapted to highland and intermediate agro-ecologies and a subset of lowland-adapted genotypes, indicating the potential for an increased heterotic response of their hybrids. However, these groups were also differentiated from the R lines, and hence differ from the existing complementary heterotic pools. This suggests that although these groups could provide highly divergent parental pools, further research is required to investigate the extent of heterosis and their hybrid performance.
Abstract:
Minimum Description Length (MDL) is an information-theoretic principle that can be used for model selection and other statistical inference tasks. There are various ways to apply the principle in practice. One theoretically valid way is to use the normalized maximum likelihood (NML) criterion; due to computational difficulties, however, this approach has not been used very often. This thesis presents efficient floating-point algorithms that make it possible to compute the NML for multinomial, Naive Bayes and Bayesian forest models. None of the presented algorithms rely on asymptotic analysis, and for the first two model classes we also discuss how to compute exact rational-number solutions.
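For concreteness, the central object in the multinomial case is the NML normalizing sum (parametric complexity) C(K, n): the sum, over all length-n data sets with K possible outcomes, of their maximized likelihoods. The Python sketch below evaluates it by brute-force enumeration of count vectors; it only illustrates the definition and is not one of the efficient algorithms presented in the thesis.

import math

def multinomial_nml_complexity(K, n):
    """Brute-force NML normalizing sum C(K, n) for a K-outcome multinomial
    over n observations, summing over all count vectors (h_1, ..., h_K)."""
    def compositions(total, parts):
        # All ways to split `total` observations over `parts` categories.
        if parts == 1:
            yield (total,)
            return
        for first in range(total + 1):
            for rest in compositions(total - first, parts - 1):
                yield (first,) + rest

    total = 0.0
    for counts in compositions(n, K):
        coef = math.factorial(n)
        for h in counts:
            coef //= math.factorial(h)
        max_lik = 1.0
        for h in counts:
            if h > 0:
                max_lik *= (h / n) ** h
        total += coef * max_lik
    return total

# log C(K, n) is the penalty term in the multinomial stochastic complexity.
print(math.log(multinomial_nml_complexity(3, 10)))

The enumeration is exponential in K, which is exactly why efficient algorithms for this quantity are needed in practice.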
Abstract:
Ubiquitous computing is about making computers and computerized artefacts a pervasive part of our everyday lives, bringing more and more activities into the realm of information. The computationalization and informationalization of everyday activities increase not only our reach, efficiency and capabilities but also the amount and kinds of data gathered about us and our activities. In this thesis, I explore how information systems can be constructed so that they handle this personal data in a reasonable manner. The thesis provides two kinds of results: on the one hand, tools and methods for the construction and evaluation of ubiquitous and mobile systems; on the other hand, an evaluation of the privacy aspects of a ubiquitous social awareness system. The work emphasises real-world experiments as the most important way to study privacy. Additionally, the state of current information systems as regards data protection is studied. The tools and methods in this thesis consist of three distinct contributions. An algorithm for locationing in cellular networks is proposed that does not require the location information to be revealed beyond the user's terminal. A prototyping platform for the creation of context-aware ubiquitous applications, called ContextPhone, is described and released as open source. Finally, a set of methodological findings for the use of smartphones in social scientific field research is reported. A central contribution of this thesis is the set of pragmatic tools that allow other researchers to carry out experiments. The evaluation of the ubiquitous social awareness application ContextContacts covers both the usage of the system in general and an analysis of its privacy implications. Drawing on several long-term field studies, the usage of the system is analyzed in light of how users make inferences about others based on real-time contextual cues mediated by the system. The analysis of privacy implications draws together the social psychological theory of self-presentation and research in privacy for ubiquitous computing, deriving a set of design guidelines for such systems. The main findings from these studies can be summarized as follows. The fact that ubiquitous computing systems gather more data about users can be used not only to study the use of such systems in an effort to create better systems, but also to study previously unstudied phenomena, such as the dynamic change of social networks. Systems that let people create new ways of presenting themselves to others can be fun for the users, but self-presentation requires several thoughtful design decisions that allow the manipulation of the image mediated by the system. Finally, the growing amount of computational resources available to users can be used to allow them to use the data themselves, rather than being merely passive subjects of data gathering.
Abstract:
There is an increase in the uptake of cloud computing services (CCS). CCS is adopted in the form of a utility, and it incorporates the business risks of the service providers and intermediaries. Thus, the adoption of CCS will change the risk profile of an organization. In this situation, organizations need to develop competencies by reconsidering their IT governance structures to achieve a desired level of IT-business alignment and maintain their risk appetite in order to source business value from CCS. We use resource-based theories to suggest that collaborative board oversight of CCS, competencies relating to CCS information and financial management, and a CCS-related continuous audit program can contribute to business process performance improvements and overall firm performance. Using survey data, we find evidence of a positive association between these IT governance considerations and business process performance. We also find evidence of a positive association between business process performance improvements and overall firm performance. The results suggest that the proposed considerations on IT governance structures can contribute to CCS-related IT-business alignment and lead to the anticipated business value from CCS. This study provides guidance to organizations on the competencies required to secure business value from CCS.
Abstract:
The concept of cloud computing services (CCS) is appealing to small and medium enterprises (SMEs). However, while there is a significant push by various authorities on SMEs to adopt CCS, knowledge of the key considerations in adopting CCS is very limited. We use the technology-organization-environment (TOE) framework to suggest that a strategic and incremental intent, an understanding of the organizational structure and culture, an understanding of the external factors, and consideration of the human resource capacity can contribute to sustainable business value from CCS. Using survey data, we find evidence of a positive association between these considerations and the CCS-related business objectives. We also find evidence of a positive association between the CCS-related business objectives and the CCS-related financial objectives. The results suggest that the proposed considerations can ensure sustainable business value from CCS. This study provides guidance to SMEs on a path to adopting CCS with the intention of a long-term commitment and achieving sustainable business value from these services.
Abstract:
Stochastic volatility models are of fundamental importance to the pricing of derivatives. One of the most commonly used models of stochastic volatility is the Heston model, in which the price and volatility of an asset evolve as a pair of coupled stochastic differential equations. The computation of asset prices and volatilities involves the simulation of many sample trajectories with conditioning. The problem is treated using the method of particle filtering. While the simulation of a shower of particles is computationally expensive, each particle behaves independently, making such simulations ideal for massively parallel heterogeneous computing platforms. In this paper, we present our portable OpenCL implementation of the Heston model and discuss its performance and efficiency characteristics on a range of architectures including Intel CPUs, NVIDIA GPUs, and Intel Many Integrated Core (MIC) accelerators.
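As a reference for what each simulated trajectory involves, the Python sketch below integrates the Heston dynamics with a full-truncation Euler scheme. It is a minimal illustration of the model, not the paper's OpenCL implementation, and the parameter values are invented.

import numpy as np

def heston_paths(s0, v0, mu, kappa, theta, xi, rho, T, steps, n_paths, seed=0):
    """Full-truncation Euler simulation of the Heston model:
    dS = mu*S dt + sqrt(v)*S dW1,  dv = kappa*(theta - v) dt + xi*sqrt(v) dW2,
    with corr(dW1, dW2) = rho."""
    rng = np.random.default_rng(seed)
    dt = T / steps
    s = np.full(n_paths, float(s0))
    v = np.full(n_paths, float(v0))
    for _ in range(steps):
        z1 = rng.standard_normal(n_paths)
        z2 = rho * z1 + np.sqrt(1.0 - rho ** 2) * rng.standard_normal(n_paths)
        v_pos = np.maximum(v, 0.0)   # full truncation of negative variance
        s *= np.exp((mu - 0.5 * v_pos) * dt + np.sqrt(v_pos * dt) * z1)
        v += kappa * (theta - v_pos) * dt + xi * np.sqrt(v_pos * dt) * z2
    return s, v

# Monte Carlo estimate of a European call price under illustrative parameters.
s_T, _ = heston_paths(100.0, 0.04, 0.02, 1.5, 0.04, 0.3, -0.7, T=1.0, steps=252, n_paths=100_000)
print(np.exp(-0.02 * 1.0) * np.maximum(s_T - 100.0, 0.0).mean())

Because the sample paths are mutually independent, the loop over particles parallelises trivially, which is what makes data-parallel heterogeneous platforms attractive for this workload.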
Abstract:
Among the iterative schemes for computing the Moore–Penrose inverse of a well-conditioned matrix, only those with an order of convergence of two or three are computationally efficient. A Fortran programme for these schemes is provided.
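For reference, the second-order member of this family is the Newton-Schulz iteration X_{k+1} = X_k (2I - A X_k). The Python sketch below (not the Fortran programme referred to above) shows the scheme with a standard starting value for which the iteration is known to converge.

import numpy as np

def pinv_newton_schulz(A, iters=50):
    """Second-order iteration X_{k+1} = X_k (2I - A X_k) converging to the
    Moore-Penrose inverse of A for a suitable starting matrix X_0."""
    m, _ = A.shape
    # Classic start: X_0 = A^T / (||A||_1 * ||A||_inf) lies in the convergence region.
    X = A.T / (np.linalg.norm(A, 1) * np.linalg.norm(A, np.inf))
    I = np.eye(m)
    for _ in range(iters):
        X = X @ (2.0 * I - A @ X)
    return X

A = np.random.default_rng(1).standard_normal((5, 3))
print(np.allclose(pinv_newton_schulz(A), np.linalg.pinv(A), atol=1e-8))

The third-order (hyperpower) variant replaces the bracket with 3I - A X_k (3I - A X_k), trading an extra matrix product per step for cubic convergence.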
Abstract:
A rank-augmented LU-algorithm is suggested for computing a generalized inverse of a matrix. Initially, suitable diagonal corrections are introduced in (the symmetrized form of) the given matrix to facilitate decomposition; a backward-correction scheme then yields the desired generalized inverse.
Abstract:
We present here a theoretical approach to compute the molecular magnetic anisotropy parameters, D_M and E_M, for single-molecule magnets in any given spin eigenstate of the exchange spin Hamiltonian. We first describe a hybrid constant-M_S valence bond (VB) technique for solving spin Hamiltonians employing full spatial and spin symmetry adaptation, and we illustrate this technique by solving the exchange Hamiltonian of the Cu6Fe8 system. Treating the anisotropy Hamiltonian as a perturbation, we compute the D_M and E_M values for various eigenstates of the exchange Hamiltonian. Since the dipolar contribution to the magnetic anisotropy is negligibly small, we calculate the molecular anisotropy from the single-ion anisotropies of the metal centers. We have studied the variation of D_M and E_M upon rotating the single-ion anisotropies in the case of the Mn12Ac and Fe8 SMMs in the ground state and a few low-lying excited states of the exchange Hamiltonian. In both systems, we find that the molecular anisotropy changes drastically when the single-ion anisotropies are rotated. While in the Mn12Ac SMM the D_M value depends strongly on the spin of the eigenstate, it is almost independent of the spin of the eigenstate in the Fe8 SMM. We also find that the D_M value is almost insensitive to the orientation of the anisotropy of the core Mn(IV) ions. The dependence of D_M on the energy gap between the ground and excited states in both systems has also been studied by using different sets of exchange constants.
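For readers less familiar with the notation, D_M and E_M are the usual axial and rhombic zero-field-splitting parameters of the molecular anisotropy tensor. In its principal-axis frame, and for a traceless tensor, the standard textbook definitions (quoted here for orientation, not derived from this paper) are:

\begin{align}
  \hat{H}_{\mathrm{aniso}} &= \hat{\mathbf{S}}\cdot\mathbf{D}\cdot\hat{\mathbf{S}}
    = D_M\left[\hat{S}_z^2 - \tfrac{1}{3}S(S+1)\right]
    + E_M\left(\hat{S}_x^2 - \hat{S}_y^2\right),\\
  D_M &= D_{zz} - \tfrac{1}{2}\left(D_{xx} + D_{yy}\right), \qquad
  E_M = \tfrac{1}{2}\left(D_{xx} - D_{yy}\right).
\end{align}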
Abstract:
It has been said that we are living in a golden age of innovation. New products, systems and services aimed at enabling a better future have emerged from novel interconnections between design and design research with science, technology and the arts. These intersections are now, more than ever, catalysts that enrich daily activities for health and safety, education, personal computing, entertainment and sustainability, to name a few. Interactive functions made possible by new materials, technology, and emerging manufacturing solutions demonstrate an ongoing interplay between cross-disciplinary knowledge and research. Such interplay brings up questions concerning: (i) how art and design provide a focus for developing design solutions and research in technology; (ii) how theories emerging from the interactions of cross-disciplinary knowledge inform both the practice and research of design; and (iii) how research and design work together in a mutually beneficial way. The IASDR2015 INTERPLAY EXHIBITION provides some examples of these interconnections of design research with science, technology and the arts. This is done through the presentation of objects, artefacts and demonstrations that are contextualised in everyday activities across various areas including health, education, safety, furniture, fashion and wearable design. The exhibits provide a setting to explore the various ways in which design research interacts across disciplinary knowledge and approaches to stimulate innovation. In education, Designing South African Children’s Health Education as Generative Play (A Bennett, F Cassim, M van der Merwe, K van Zijil, and M Ribbens) presents a set of toolkits that resulted from design research entailing generative play. The toolkits are systems that engender pleasure and responsibility, and are aimed at cultivating South African youth’s awareness of nutrition, hygiene, disease awareness and prevention, and social health. In safety, AVAnav: Avalanche Rescue Helmet (Jason Germany) delivers an interactive system as a tool to help reduce the time needed to locate buried avalanche victims. Helmet-mounted, this system responds to the contextual needs of rescuers and has since led to further design research on the interface design of rescue devices. In apparel design and manufacturing, Shrinking Violets: Fashion design for disassembly (Alice Payne) proposes design for disassembly through beautiful reversible mono-material garments that respond to the challenges of garment construction in the fashion industry, capturing the metaphor for the interplay between technology and craft in the fashion manufacturing industry. Harvest: A biotextile future (Dean Brough and Alice Payne) explores the interplay of biotechnology, materiality and textile design in the creation of a sustainable, biodegradable vegan textile produced with a symbiotic culture of bacteria and yeast (SCOBY). SCOBY is a pellicle curd that can be harvested, machine washed, dried and cut into a variety of designs and texture combinations. The exploration of smart materials, wearable design and micro-electronics led to creative and aesthetically coherent stimulus-reactive jewellery in Symbiotic Microcosms: Crafting Digital Interaction (K Vones).
This creation aims to bridge the gap between craft practitioner and scientific discovery, proposing a move towards the notion of a post-human body, where wearable design is seen as potential ground for new human-computer interactions, affording the development of visually engaging multifunctional enhancements. In furniture design, Smart Assistive Chair for Older Adults (Chao Zhao) demonstrates how cross-disciplinary knowledge, interacting with design strategies, provides a solution that employs new technological developments in aged care and draws on the participation of multiple stakeholders: designers, the health care system and community-based health systems. In health, Molecular Diagnosis System for Newborns' Deafness Genetic Screening (Chao Zhao) presents an ambitious and complex project that includes a medical device aimed at resolving a number of challenges: technical feasibility for city and rural contexts, compatibility with standard laboratory and hospital systems, access to the health system, and support for the work of different hospital specialists. The interplay between disciplines is evident in this work, demonstrating how design research moves forward through technology developments. These works exemplify the intersection between domains as a means to innovation. Novel design problems are identified as design intersects with the various areas. Research informs this process in different ways. We see the background investigation into the contextualising domain (e.g. on-snow studies, garment recycling, South African health concerns, the post-human body) to identify gaps in the area and design criteria; the technology and materials reviews (e.g. AR, biotextiles) to offer plausible technical means to solve these, as well as design criteria. Theoretical reviews can also inform the design (e.g. play, flow). These work together to equip the design practitioner with a robust set of 'tools' for design innovation, tools that are based in research. The process identifies innovative opportunity and criteria for design and this, in turn, provides a means for evaluating the success of the design outcomes. Such an approach has the potential to come full circle between research and design, where the design can function as an exemplar, evidencing how the research-articulated problems can be solved. Core to this, however, is the evaluation of the design outcome itself and the identification of knowledge outcomes. In some cases this is fairly straightforward, that is, easily measurable. For example, the efficacy of Jason Germany’s helmet can be determined by measuring the reduced response time in the rescuer. Similarly, the improved ability to recycle Payne’s panel garments can be clearly determined by comparing it to existing recycling processes (and her identified criterion of separating textile elements), while the sustainability and durability of Brough and Payne’s biotextile can be assessed by documenting the growth and decay processes, or through comparative strength studies. There are, however, situations where knowledge outcomes and insights are not so easily determined. Many of the works here are open-ended in nature, as they emphasise the holistic experience of one or more designs, in context: “the end result of the art activity that provides the health benefit or outcome but rather, the value lies in the delivery and experience of the activity” (Bennett et al.)
Similarly, reconfiguring layers of laser-cut silk in Payne’s Shrinking Violets constitutes a customisable, creative process of clothing oneself, since it “could be layered to create multiple visual effects”. Symbiotic Microcosms also has room for facilitating experience, as the work is described as facilitating “serendipitous discovery”. These examples show the diverse emphasis of enquiry on the experience versus the product. Open-ended experiences are ambiguous, multifaceted and differ from person to person and moment to moment (Eco 1962). Determining success is not always clear or immediately discernible; it may also not be the most useful question to ask. Rather, research that seeks to understand the nature of the experience afforded by the artefact is most useful in these situations. It can inform the design practitioner by helping them with subsequent re-design, as well as potentially being generalizable to other designers and design contexts. Bennett et al. exemplify how this may be approached from a theoretical perspective. Their work is concerned with facilitating engaging experiences to educate and, ultimately, impact on that community. The research is concerned with the nature of that experience as well, and to study it the authors have employed theoretical lenses, here those of flow, pleasure and play. An alternative or complementary approach to using theory is to use qualitative studies, such as interviews that ask users what they experienced. Here the user insights become evidence for generalising across cases, potentially revealing insight into relevant concerns, such as the range of possible ‘playful’ experiences that may be afforded, or the situation that preceded a ‘serendipitous discovery’. As shown, the IASDR2015 INTERPLAY EXHIBITION provides a platform for exploration, discussion and interrogation around the interplay of design research across diverse domains. We look forward with excitement as IASDR continues to bring research and design together, and as our communities of practitioners continue to push the envelope of what design is and how it can be expanded and better understood with research to foster new work and, ultimately, stimulate innovation.
Abstract:
With the development of wearable and mobile computing technology, more and more people are using sleep-tracking tools to collect personal sleep data on a daily basis, aiming to understand and improve their sleep. While sleep quality is influenced by many factors in a person’s lifestyle context, such as exercise, diet and steps walked, existing tools simply visualize sleep data per se on a dashboard rather than analyze those data in combination with contextual factors. Hence, many people find it difficult to make sense of their sleep data. In this paper, we present a cloud-based intelligent computing system named SleepExplorer that incorporates sleep domain knowledge and association rule mining for automated analysis of personal sleep data in light of contextual factors. Experiments show that the same contextual factors can play a distinct role in the sleep of different people, and that SleepExplorer could help users discover the factors that are most relevant to their personal sleep.
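To make the association-rule idea concrete, the toy Python sketch below mines rules of the form {contextual factors} -> {good sleep} using the standard support, confidence and lift measures. The data and column names are purely hypothetical, and this is a minimal illustration rather than SleepExplorer's actual pipeline.

from itertools import combinations
import pandas as pd

# Hypothetical one-hot daily records: each row is a day, each column a
# binarised contextual factor or sleep outcome (names are illustrative only).
days = pd.DataFrame([
    {"exercised": True,  "late_caffeine": False, "steps_over_8k": True,  "good_sleep": True},
    {"exercised": False, "late_caffeine": True,  "steps_over_8k": False, "good_sleep": False},
    {"exercised": True,  "late_caffeine": False, "steps_over_8k": True,  "good_sleep": True},
    {"exercised": False, "late_caffeine": False, "steps_over_8k": True,  "good_sleep": True},
    {"exercised": False, "late_caffeine": True,  "steps_over_8k": False, "good_sleep": False},
])

def rules_for(df, outcome, min_support=0.4, min_confidence=0.8):
    """Mine rules {context factors} -> {outcome} with support/confidence/lift."""
    factors = [c for c in df.columns if c != outcome]
    p_outcome = df[outcome].mean()
    found = []
    for k in (1, 2):
        for antecedent in combinations(factors, k):
            mask = df[list(antecedent)].all(axis=1)
            support = (mask & df[outcome]).mean()
            if mask.mean() == 0 or support < min_support:
                continue
            confidence = support / mask.mean()
            if confidence >= min_confidence:
                found.append((antecedent, support, confidence, confidence / p_outcome))
    return found

for antecedent, support, confidence, lift in rules_for(days, "good_sleep"):
    print(antecedent, f"support={support:.2f}", f"confidence={confidence:.2f}", f"lift={lift:.2f}")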
Abstract:
We compute the throughput obtained by a TCP connection in a UMTS environment. For downloading data at a mobile terminal, the packets of each TCP connection are stored in separate queues at the base station (Node B). Also, due to fragmentation of the TCP packets into Protocol Data Units (PDUs) and link-layer retransmissions of PDUs, there can be significant delays at the Node B queue. In such a scenario the existing models of TCP may not be sufficient. Thus, we provide a new approximate TCP model and also obtain new closed-form expressions for the mean window size. Using these, we obtain the throughput of a TCP connection, which matches simulations quite well.
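For context, a widely used baseline for long-lived TCP throughput is the classical square-root approximation shown below; it is quoted here as a standard reference formula, not as the paper's new closed-form result, and it glosses over precisely the Node B queueing and PDU retransmission delays that motivate the new model.

\begin{equation}
  B \approx \frac{\mathrm{MSS}}{\mathrm{RTT}} \sqrt{\frac{3}{2p}}
\end{equation}

where MSS is the TCP segment size, RTT the round-trip time (here inflated by Node B queueing and link-layer retransmissions), and p the packet loss probability.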
Abstract:
The move towards IT outsourcing is the first step towards an environment where compute infrastructure is treated as a service. In utility computing, this IT service has to honor Service Level Agreements (SLAs) in order to meet the desired Quality of Service (QoS) guarantees. Such an environment requires reliable services in order to maximize the utilization of the resources and to decrease the Total Cost of Ownership (TCO). Such reliability cannot come at the cost of resource duplication, since that increases the TCO of the data center and hence the cost per compute unit. In this paper, we look into projecting the impact of hardware failures on SLAs and the techniques required to take proactive recovery steps in case of a predicted failure. By maintaining health vectors of all hardware and system resources, we predict the failure probability of resources at runtime, based on observed hardware errors and failure events. This, in turn, influences an availability-aware middleware to take proactive action (even before the application is affected, in case the system and the application have low recoverability). The proposed framework has been prototyped on a system running HP-UX. Our offline analysis of the prediction system on hardware error logs indicates no more than 10% false positives. To the best of our knowledge, this work is the first of its kind to perform an end-to-end analysis of the impact of a hardware fault on application SLAs in a live system.
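To illustrate the flavour of such a predictor, the hypothetical Python sketch below keeps a per-resource health score as an exponentially decayed count of observed error events and maps it to a failure probability that, beyond a threshold, triggers proactive action. All names and constants are invented; this is not the HP-UX prototype described in the paper.

import math
import time

class ResourceHealth:
    """Hypothetical per-resource health record: an exponentially decayed error
    count is mapped to a failure probability and compared against a threshold
    to decide whether to trigger proactive recovery (e.g. workload migration)."""

    def __init__(self, half_life_s=3600.0, failure_threshold=0.7):
        self.half_life_s = half_life_s
        self.threshold = failure_threshold
        self.score = 0.0              # decayed count of observed hardware errors
        self.last_update = time.time()

    def record_error(self, weight=1.0):
        now = time.time()
        decay = 0.5 ** ((now - self.last_update) / self.half_life_s)
        self.score = self.score * decay + weight
        self.last_update = now

    def failure_probability(self):
        # Logistic mapping of the decayed error score into (0, 1); the offset 3.0
        # is an arbitrary illustrative calibration point.
        return 1.0 / (1.0 + math.exp(-(self.score - 3.0)))

    def needs_proactive_recovery(self):
        return self.failure_probability() >= self.threshold

disk = ResourceHealth()
for _ in range(5):
    disk.record_error()               # e.g. correctable errors parsed from logs
if disk.needs_proactive_recovery():
    print("migrate workloads off this resource before the SLA is impacted")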