892 results for Parallel computing, Virtual machine, Composition, Determinism, Abstraction
Abstract:
One of the core tasks of the virtual-manufacturing environment is to characterise the transformation of the state of material during each of the unit processes. This transformation in shape, material properties, etc. can only be reliably achieved through the use of models in a simulation context. Unfortunately, many manufacturing processes involve the material being treated in both the liquid and solid state, the transformation of which may be achieved by heat transfer and/or electromagnetic fields. The computational modelling of such processes, involving the interactions amongst various phenomena, is a considerable challenge. However, it must be addressed effectively if Virtual Manufacturing Environments are to become a reality! This contribution focuses upon one attempt to develop such a multi-physics computational toolkit. The approach uses a single discretisation procedure and provides for direct interaction amongst the component phenomena. The need to exploit parallel high-performance hardware is addressed so that simulation elapsed times can be brought within the realms of practicality. Examples of multi-physics modelling in relation to shape casting and solder joint formation reinforce the motivation for this work.
Abstract:
A parallel method for dynamic partitioning of unstructured meshes is described. The method employs a new iterative optimisation technique that both balances the workload and attempts to minimise the interprocessor communications overhead. Experiments on a series of adaptively refined meshes indicate that the algorithm provides partitions of quality equivalent or superior to those of static partitioners (which do not reuse the existing partition), and much more quickly. Perhaps more importantly, the algorithm results in only a small fraction of the data migration incurred by the static partitioners.
Abstract:
In many areas of simulation, a crucial component of efficient numerical computation is the use of solution-driven adaptive features: locally adapted meshing or re-meshing, and dynamically changing computational tasks. The full advantages of high-performance computing (HPC) technology can thus be exploited only when efficient parallel adaptive solvers are realised. The resulting requirement for HPC software is dynamic load balancing, which for many mesh-based applications means dynamic mesh re-partitioning. The DRAMA project has been initiated to address this issue, with a particular focus on the requirements of industrial Finite Element codes, although codes using Finite Volume formulations will also be able to make use of the project results.
Abstract:
A method is outlined for optimising the graph partitions which arise in mapping unstructured mesh calculations to parallel computers. The method employs a relative-gain iterative technique to both evenly balance the workload and minimise the number and volume of interprocessor communications. A parallel graph reduction technique, which can be used to give a global perspective to the optimisation, is also briefly described. The algorithms work efficiently in parallel as well as sequentially, and when combined with a fast direct partitioning technique (such as the Greedy algorithm) to give an initial partition, the resulting two-stage process proves to be a powerful and flexible solution to the static graph-partitioning problem. Experiments indicate that the resulting parallel code can provide high-quality partitions, independent of the initial partition, within a few seconds. The algorithms can also be used for dynamic load-balancing, reusing existing partitions; in this case the procedures are much faster than static techniques, provide partitions of similar or higher quality and, in comparison, involve the migration of only a fraction of the data.
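The two-stage process described above lends itself to a compact illustration. The sketch below is ours, not the authors' code: a BFS-grown, Greedy-style initial partition followed by a gain-based boundary refinement pass over a toy adjacency-list graph; all names and parameters are hypothetical.

```python
# Illustrative sketch only (not the authors' code): a Greedy-style initial
# partition grown by breadth-first search, followed by a boundary refinement
# pass in the spirit of the relative-gain optimisation described above.
from collections import deque

def greedy_partition(adj, k):
    """Grow k parts by BFS until each holds roughly n/k vertices."""
    target = (len(adj) + k - 1) // k
    part = {v: None for v in adj}
    unassigned = set(adj)
    sizes = [0] * k
    for p in range(k):
        if not unassigned:
            break
        queue = deque([next(iter(unassigned))])
        while queue and sizes[p] < target:
            v = queue.popleft()
            if part[v] is not None:
                continue
            part[v] = p
            sizes[p] += 1
            unassigned.discard(v)
            queue.extend(u for u in adj[v] if part[u] is None)
    for v in unassigned:                  # leftovers go to the smallest part
        p = min(range(k), key=sizes.__getitem__)
        part[v], sizes[p] = p, sizes[p] + 1
    return part

def refine(adj, part, k, rounds=10):
    """Move boundary vertices to the neighbouring part with the highest
    gain (cut edges removed minus cut edges kept), under a balance cap."""
    target = (len(adj) + k - 1) // k
    sizes = [0] * k
    for v in part:
        sizes[part[v]] += 1
    for _ in range(rounds):
        moved = False
        for v in adj:
            home = part[v]
            counts = {}                   # neighbours of v in each part
            for u in adj[v]:
                counts[part[u]] = counts.get(part[u], 0) + 1
            best, gain = home, 0
            for p, c in counts.items():
                if p != home and c - counts.get(home, 0) > gain and sizes[p] < target:
                    best, gain = p, c - counts.get(home, 0)
            if best != home:
                sizes[home] -= 1
                sizes[best] += 1
                part[v] = best
                moved = True
        if not moved:                     # stop when a full sweep makes no move
            break
    return part

# Toy mesh-dual graph: two triangles joined by one edge -> one cut edge.
adj = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
print(refine(adj, greedy_partition(adj, 2), 2))
```

A production partitioner would also weight vertices and edges and run the sweep in parallel, but the gain computation above is the core of the refinement idea.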
Abstract:
This paper introduces the stochastic version of the Geometric Machine Model for the modelling of sequential, alternative, parallel (synchronous) and nondeterministic computations with stochastic numbers stored in a (possibly infinite) shared memory. The programming language L(D!∞), induced by the Coherence Space of Processes D!∞, can be applied to sequential and parallel products in order to provide recursive definitions for such processes, together with a domain-theoretic semantics of the Stochastic Arithmetic. We analyze both the spatial (ordinal) recursion, related to the spatial modelling of the stochastic memory, and the temporal (structural) recursion, given by the inclusion relation modelling partial objects in the ordered structure of process construction.
Abstract:
A primary goal of context-aware systems is delivering the right information at the right place and time so that users can make effective decisions and improve their quality of life. There are three key requirements for achieving this goal: determining what information is relevant, personalizing it based on the user's context (location, preferences, behavioral history, etc.), and delivering it in a timely manner without an explicit request. These requirements create a paradigm that we term "Proactive Context-aware Computing". Most existing context-aware systems fulfill only a subset of these requirements: many focus solely on personalizing requested information based on the user's current context, and they are often designed for specific domains. Moreover, most existing systems are reactive: the user requests some information and the system delivers it. They are not proactive, i.e., they cannot anticipate users' intent and behavior and act without an explicit request. To overcome these limitations, we need a deeper analysis and better understanding of context-aware systems that are generic, universal, proactive, and applicable to a wide variety of domains. To support this dissertation, we explore several directions. Clearly the most significant sources of information about users today are smartphones: a large amount of user context can be acquired through them, and they are an effective means of delivering information to users. In addition, social media such as Facebook, Flickr and Foursquare provide a rich and powerful platform for mining users' interests, preferences, and behavioral history. We employ the ubiquity of smartphones and the wealth of information available from social media to address the challenge of building proactive context-aware systems. We have implemented and evaluated several approaches, including some as part of the Rover framework, to realize the paradigm of Proactive Context-aware Computing. Rover is a context-aware research platform that has been evolving for the last 6 years. Since location is one of the most important contexts for users, we have developed 'Locus', an indoor localization, tracking and navigation system for multi-story buildings. Another important dimension of a user's context is the activities they are engaged in. To this end, we have developed 'SenseMe', a system that leverages the smartphone and its multiple sensors to perform multidimensional context and activity recognition for users. As part of the 'SenseMe' project, we also conducted an exploratory study of privacy, trust, risks, and other user concerns with smartphone-based personal sensing systems and applications. To determine what information is relevant to a user's situation, we have developed 'TellMe', a system that employs a new, flexible and scalable approach based on Natural Language Processing techniques to perform bootstrapped discovery and ranking of relevant information in context-aware systems. To personalize the relevant information, we have also developed an algorithm and system for mining a broad range of user preferences from social network profiles and activities.
For recommending new information to users based on their past behavior and context history (such as visited locations, activities, and time), we have developed a recommender system and approach for performing multi-dimensional collaborative recommendations using tensor factorization. Timely delivery of personalized, relevant information requires anticipating and predicting user behavior. To this end, we have developed a unified infrastructure within the Rover framework and implemented several novel approaches and algorithms that employ various contextual features and state-of-the-art machine learning techniques to build diverse behavioral models of users. Examples of generated models include classifying users' semantic places and mobility states, predicting their availability for accepting calls on smartphones, and inferring their device-charging behavior. Finally, to enable proactivity in context-aware systems, we have also developed a planning framework based on Hierarchical Task Network (HTN) planning. Together, these works provide a major push in the direction of proactive context-aware computing.
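As a rough illustration of the tensor-factorization idea mentioned above (not the dissertation's actual model), the following sketch fits a rank-R CP decomposition of a toy user x item x context tensor with alternating least squares; all dimensions, data, and names are hypothetical.

```python
# Toy sketch of multi-dimensional collaborative recommendation via CP tensor
# factorization (ALS updates) on a user x item x context ratings tensor.
# Data and rank are invented; the dissertation's features are not reproduced.
import numpy as np

def khatri_rao(B, C):
    """Column-wise Khatri-Rao product; row order matches unfold() below."""
    return (B[:, None, :] * C[None, :, :]).reshape(-1, B.shape[1])

def unfold(T, mode):
    """Mode-n unfolding of a 3-way tensor into a matrix."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def cp_als(X, rank=4, iters=50, seed=0):
    """Alternating least squares for a rank-R CP decomposition of
    X[user, item, context] into factor matrices A, B, C."""
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((X.shape[0], rank))
    B = rng.standard_normal((X.shape[1], rank))
    C = rng.standard_normal((X.shape[2], rank))
    for _ in range(iters):
        A = unfold(X, 0) @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = unfold(X, 1) @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = unfold(X, 2) @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# Predicted affinity of user u for item i in context c:
#   X_hat[u, i, c] = sum_r A[u, r] * B[i, r] * C[c, r]
X = np.random.default_rng(1).random((20, 30, 5))   # toy observations
A, B, C = cp_als(X)
X_hat = np.einsum('ur,ir,cr->uic', A, B, C)        # dense score tensor
```

Recommendations for a user in a given context then come from ranking the items by the reconstructed scores X_hat[u, :, c].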
Abstract:
Based on the presupposition that the arts in the West have always drawn on the resources, supports, and devices of their time, this study reflects on scenic compositions mediated by digital technologies. Such technologies are embedded in daily routine and also shape artistic experiments, playing a dialogical role at the art/technology intersection. The proposal is therefore to investigate what relationships are established in the contemporary theatrical scene through contagion by digital technologies, drawing this parallel through dialogue with the authors who discuss the subject and through the practices of groups for which technological resources are a determining factor in their plays. Furthermore, it reflects on the scene that incorporates, or takes place within, intermedial events, analyzing how digital technologies (re)configure the compositional processes of the plays of GAG Phila7, in the city of São Paulo/SP. To this end, the dissertation is organized in three sections comprising four moments: a brief overview of the field, contextualization, poetic analysis, and synthesis. The methodological proposal employs qualitative methods: semi-structured interviews, note-taking, and document collection (programs, website, playbook, publicity material, photographs, and videos). Within the universe of qualitative research, it works from the epistemological perspective of Gadamer's philosophical hermeneutics. The possibilities opened by the virtual double (Internet/web) have generated a type of theater with another material basis and new forms of organization and structure; it is possible to perceive that such technological advances and the arts contaminate each other, producing a dislocation in the logics of theatrical composition, a movement that began with the artistic avant-gardes and has gradually intensified, offering new possibilities of construction and hybridization of the most varied kinds. The experiment "Profanações_superfície de eventos de construção coletiva", conceived by Phila7, falls within this perspective. As the object of this research's discussion, the experiment works with the possible poetics arising from the intersection with digital technologies, aiming to identify and problematize the challenges posed by technological evolution and expansion in a scenic context.
Abstract:
This text offers some of the author's theoretical reflections and views on the relationship between people today and the phenomenon of digital communication, also known as virtual communication, over the Internet. These reflections emerge from research on the convergent model of the public television channel Telemedellín, and explore the landscape of human relationships with others, with oneself, and with things, as mediated by the computer. The text also examines the composition of social networks and their product, the virtual community, the place where human exchanges and interrelations occur. Finally, it attempts to portray the restlessness of people in the present age, finding them to be solitary beings struggling for a place in the world.
Abstract:
Mobile notebooks that allow campus-wide access to the university network open up new possibilities for integrating network-based resources into regular teaching. We report on the development and bundled deployment of network-based tools in foundational psychology education. Specific functionalities of mobile notebooks were harnessed to support teaching and learning processes in a wide range of applications, from online feedback instruments to a virtual experimental laboratory. They foster individual knowledge construction by linking self-regulated and cooperative learning, providing immediate feedback, and additionally supporting the development of social reference norms. In this way they create a framework in which students, following the Cognitive Apprenticeship approach, are supported by fellow learners and instructors on their way into the scientific community of experimentally working psychologists. Within a single semester, mobile notebooks and network-based tools were successfully integrated into regular teaching. The cognitive and affective foundations of a lasting improvement in teaching and learning quality through the use of such instruments are discussed. (DIPF/Orig.)
Abstract:
Datacenters have emerged as the dominant form of computing infrastructure over the last two decades. The tremendous increase in data-analysis requirements has led to a proportional increase in power consumption, and datacenters are now among the fastest-growing electricity consumers in the United States. Another rising concern is the loss of throughput due to network congestion: scheduling models that do not explicitly account for data placement may transfer large amounts of data over the network, causing unacceptable delays. In this dissertation, we study different scheduling models inspired by the dual objectives of minimizing energy costs and network congestion in a datacenter. As datacenters are provisioned to handle peak workloads, average server utilization in most datacenters is very low; as a result, huge energy savings can be achieved by selectively shutting down machines when demand is low. We introduce the network-aware machine activation problem, which seeks a schedule that simultaneously minimizes the number of machines activated and the congestion incurred in the network. Our model significantly generalizes well-studied combinatorial optimization problems such as hard-capacitated hypergraph covering and is thus strongly NP-hard; we therefore focus on finding good approximation algorithms. Data-parallel computation frameworks such as MapReduce have popularized applications that require a large amount of communication between machines, and efficient scheduling of these communication demands is essential for efficient execution. In the second part of the thesis, we study the approximability of the co-flow scheduling problem, recently introduced to capture these application-level demands. Finally, we study the question, "In what order should one process jobs?" Often, precedence constraints specify a partial order over the set of jobs, and the objective is to find suitable schedules that satisfy it. In the presence of hard deadline constraints, however, it may be impossible to satisfy all precedence constraints. We formalize different variants of job scheduling with soft precedence constraints and conduct the first systematic study of these problems.
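For intuition only, here is a deliberately simplified, set-cover-flavoured greedy heuristic for the machine-activation idea: switch on as few machines as possible to cover the demand. It ignores the network-congestion term and is not one of the dissertation's approximation algorithms; capacities and demands are invented.

```python
# Toy greedy heuristic for the flavour of machine activation: activate as few
# machines as possible while covering all demand, taking the machines that
# serve the most remaining demand first. Purely illustrative.

def activate_machines(capacity, demand):
    """capacity: dict machine -> units it can serve once activated.
    demand: total units of work to place on active machines.
    Returns the machines switched on (largest capacity first)."""
    active, remaining = [], demand
    for m in sorted(capacity, key=capacity.get, reverse=True):
        if remaining <= 0:
            break
        active.append(m)
        remaining -= capacity[m]
    if remaining > 0:
        raise ValueError("total capacity is insufficient for the demand")
    return active

# Example: 70 units of demand -> the two largest machines suffice.
machines = {"m1": 40, "m2": 35, "m3": 20, "m4": 10, "m5": 5}
print(activate_machines(machines, 70))   # ['m1', 'm2']
```

The hard part the dissertation addresses is precisely what this sketch omits: choosing activations so that the induced data placement also keeps network congestion low.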
Abstract:
Building and maintaining muscle is critical to quality of life for adults and the elderly. Physical activity and nutrition are important factors for long-term muscle health. In particular, dietary protein, including protein distribution and quality, is an under-appreciated determinant of muscle health in adults. The most unequivocal evidence for the benefit of optimal dietary protein at individual meals comes from studies of weight management. During the catabolic condition of weight loss, higher-protein diets attenuate the loss of lean tissue and shift weight loss toward body fat when compared with commonly recommended high-carbohydrate, low-protein diets. Muscle protein turnover is a continuous process in which proteins are degraded and replaced by newly synthesized proteins; muscle growth occurs when protein synthesis exceeds protein degradation. Regulation of protein synthesis is complex, with multiple signals influencing this process. The mammalian target of rapamycin complex 1 (mTORC1) pathway has been identified as a particularly important regulator of protein synthesis, via stimulation of translation initiation. Key regulatory points of translation initiation effected by mTORC1 include assembly of the eukaryotic initiation factor 4F (eIF4F) complex and phosphorylation of the 70-kilodalton ribosomal protein S6 kinase (S6K1). Assembly of the eIF4F initiation complex involves phosphorylation of the inhibitory eIF4E binding protein-1 (4E-BP1), which releases the initiation factor eIF4E and allows it to bind eIF4G. Binding of eIF4E to eIF4G prepares the mRNA for binding to the 43S pre-initiation complex. Consumption of the amino acid leucine (Leu) is a key factor determining the anabolic response of muscle protein synthesis (MPS) and mTORC1 signaling to a meal. Research from this dissertation demonstrates that the peak activation of MPS following a complete meal is proportional to the Leu content of the meal and its ability to elevate plasma Leu. Leu has also been implicated as an inhibitor of muscle protein degradation (MPD). In particular, there is evidence that, in muscle-wasting conditions, Leu supplementation attenuates expression of the ubiquitin-proteasome pathway, the primary mode of intracellular protein degradation; however, this is untested in healthy, physiological feeding models. An experiment was therefore performed to determine whether feeding isonitrogenous protein sources with different Leu contents to healthy adult rats would differentially affect ubiquitin-proteasome (protein degradation) outcomes, and whether these outcomes are related to the meal responses of plasma Leu. Results showed that higher-Leu diets attenuated total proteasome content but had no effect on ubiquitin proteins. This research shows that dietary Leu determines postprandial muscle anabolism. In a parallel line of research, the effects of dietary Leu on changes in muscle mass over time were investigated. Animals consuming higher-Leu diets had larger gastrocnemius muscle weights; furthermore, gastrocnemius muscle weights were correlated with postprandial changes in MPS (r=0.471, P<0.01) and plasma Leu (r=0.400, P=0.01). These results show that the effect of Leu on ubiquitin-proteasome pathways is minimal in healthy adult rats consuming adequate diets; thus, the long-term changes in muscle mass observed in adult rats are likely due to differences in MPS rather than MPD. Factors determining the duration of Leu-stimulated MPS were further investigated. Despite continued elevations in plasma Leu and associated translation initiation factors (e.g., S6K1 and 4E-BP1), MPS returned to basal levels approximately 3 hours after a meal. However, administration of additional nutrients in the form of carbohydrate, Leu, or both approximately 2 hours after a meal extended the elevation of MPS in a time- and dose-dependent manner. This effect led to the novel discovery that decreases in translation elongation activity were associated with increases in the activity of AMP kinase, a key cellular energy sensor. This research shows that the Leu density of dietary protein determines anabolic signaling, thereby affecting cellular energetics and body composition.
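The turnover relationship this abstract builds on can be stated compactly (notation ours, not the dissertation's):

```latex
\text{net protein balance} = \mathrm{MPS} - \mathrm{MPD},
\qquad \text{muscle growth} \iff \mathrm{MPS} > \mathrm{MPD}
```

The reported findings locate the Leu effect on the MPS term of this balance rather than the MPD term.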
Abstract:
199 p.
Abstract:
Virtual Screening (VS) methods can considerably aid clinical research by predicting how ligands interact with drug targets. However, the accuracy of most VS methods is constrained by limitations in the scoring functions that describe biomolecular interactions, and even today these uncertainties are not completely understood. To improve the accuracy of the scoring functions used in most VS methods, we propose a novel hybrid approach in which neural network (NNET) and support vector machine (SVM) models are trained on databases of known active (drug) and inactive compounds, and this information is subsequently exploited to improve VS predictions.
Abstract:
Part 18: Optimization in Collaborative Networks
Abstract:
Part 17: Risk Analysis