908 results for global positioning systems
Abstract:
We propose a novel information-theoretic approach for Bayesian optimization called Predictive Entropy Search (PES). At each iteration, PES selects the next evaluation point that maximizes the expected information gained with respect to the global maximum. PES codifies this intractable acquisition function in terms of the expected reduction in the differential entropy of the predictive distribution. This reformulation allows PES to obtain approximations that are both more accurate and efficient than other alternatives such as Entropy Search (ES). Furthermore, PES can easily perform a fully Bayesian treatment of the model hyperparameters while ES cannot. We evaluate PES in both synthetic and real-world applications, including optimization problems in machine learning, finance, biotechnology, and robotics. We show that the increased accuracy of PES leads to significant gains in optimization performance.
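As a toy illustration of the entropy-reduction idea behind the acquisition function (not the paper's actual approximation scheme, which is considerably more involved), note that a Gaussian predictive distribution has a closed-form differential entropy, so an information-gain score can be computed directly from predictive variances. The variances below are made-up placeholders:

```python
import numpy as np

def gaussian_entropy(var):
    # Differential entropy of a Gaussian with variance `var`:
    # H = 0.5 * log(2 * pi * e * var).
    return 0.5 * np.log(2 * np.pi * np.e * var)

# Hypothetical predictive variances at three candidate points, before and
# after conditioning on an imagined observation of the optimum location.
var_before = np.array([1.0, 0.5, 2.0])
var_after = np.array([0.6, 0.4, 0.9])

# PES-style score: expected reduction in predictive differential entropy.
info_gain = gaussian_entropy(var_before) - gaussian_entropy(var_after)
next_idx = int(np.argmax(info_gain))  # evaluate where we learn the most
```

The candidate with the largest entropy reduction (here the third one) would be chosen as the next evaluation point.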
Abstract:
A novel method for positioning InAs islands on GaAs (110) by cleaved-edge overgrowth is reported. The first-growth sample contains a strained InxGa1-xAs/GaAs superlattice (SL) of varying indium fraction, which acts as a strain nanopattern for the cleaved-edge overgrowth. Atoms incident on the cleaved edge preferentially migrate to the InGaAs regions, where favorable bonding sites are available. By this method, InAs island chains with a lateral periodicity defined by the thicknesses of the InGaAs and GaAs layers of the SL have been realized by molecular beam epitaxy (MBE) and observed by means of atomic force microscopy (AFM). The effect of the strain nanopattern is studied by varying the indium fraction of the SL and the MBE growth conditions. (c) 2005 Elsevier B.V. All rights reserved.
Abstract:
Knowledge Innovation Project of Chinese Academy of Sciences [KZCX3-SW-347]; National Science Fund for Distinguished Young Scholar [40225004]
Abstract:
Malicious software (malware) has significantly increased in number and effectiveness over the past years. Until 2006, such software was mostly used to disrupt network infrastructures or to show off coders' skills. Nowadays, malware constitutes a very important source of economic profit and is very difficult to detect. Thousands of novel variants are released every day, and modern obfuscation techniques are used to ensure that signature-based anti-malware systems cannot detect such threats. This tendency has also appeared on mobile devices, with Android being the most targeted platform. To counteract this phenomenon, the scientific community has developed many approaches that attempt to increase the resilience of anti-malware systems. Most of these approaches rely on machine learning and have also become very popular in commercial applications. However, attackers are now knowledgeable about these systems and have started preparing their countermeasures. This has led to an arms race between attackers and developers: novel systems are progressively built to tackle attacks that grow more and more sophisticated. For this reason, developers increasingly need to anticipate the attackers' moves. This means that defense systems should be built proactively, i.e., by introducing security design principles into their development. The main goal of this work is to show that such a proactive approach can be employed on a number of case studies. To do so, I adopted a global methodology that can be divided into two steps: first, understanding the vulnerabilities of current state-of-the-art systems (this anticipates the attackers' moves); then, developing novel systems that are robust to these attacks, or suggesting research guidelines with which current systems can be improved. This work presents two main case studies, concerning the detection of PDF and Android malware.
The idea is to show that a proactive approach can be applied to both the x86 and the mobile world. The contributions provided by these two case studies are manifold. With respect to PDF files, I first develop novel attacks that can empirically and optimally evade current state-of-the-art detectors. Then, I propose possible solutions with which the robustness of such detectors against known and novel attacks can be increased. With respect to the Android case study, I first show how current signature-based tools and academically developed systems are weak against empirical obfuscation attacks, which can easily be employed without particular knowledge of the targeted systems. Then, I examine a possible strategy for building a machine-learning detector that is robust against both empirical obfuscation and optimal attacks. Finally, I show how proactive approaches can also be employed to develop systems that are not aimed at detecting malware, such as mobile fingerprinting systems. In particular, I propose a methodology for building a powerful mobile fingerprinting system, and examine possible attacks with which users might be able to evade it, thus preserving their privacy. To provide the aforementioned contributions, I co-developed (in cooperation with researchers at PRALab and Ruhr-Universität Bochum) various systems: a library for performing optimal attacks against machine-learning systems (AdversariaLib), a framework for automatically obfuscating Android applications, a system for the robust detection of JavaScript malware inside PDF files (LuxOR), a robust machine-learning system for the detection of Android malware, and a system for fingerprinting mobile devices. I also contributed to the development of Android PRAGuard, a dataset containing many empirical obfuscation attacks against the Android platform. Finally, I entirely developed Slayer NEO, an evolution of a previous system for the detection of PDF malware.
The results attained by using the aforementioned tools show that it is possible to proactively build systems that predict possible evasion attacks. This suggests that a proactive approach is crucial to build systems that provide concrete security against general and evasion attacks.
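The "optimal" evasion attacks discussed above can be illustrated with a minimal, hypothetical sketch: a toy linear detector (not any of the systems named in this abstract) is evaded by moving a malicious sample's feature vector against the gradient of the detector's score. All weights and feature values are invented for illustration:

```python
import numpy as np

# Hypothetical linear detector: a sample is flagged as malicious
# when w . x + b > 0. Weights and features are made up.
w = np.array([2.0, -1.0, 0.5])
b = -0.5
x = np.array([1.0, 0.2, 1.0])  # feature vector of a "malicious" sample

def score(v):
    # Detection score; positive means "malicious".
    return float(w @ v + b)

# Gradient-based evasion: repeatedly nudge the sample against the
# gradient of the score until it is classified as benign.
step = 0.1
x_adv = x.copy()
for _ in range(100):
    if score(x_adv) <= 0:
        break
    x_adv = x_adv - step * w  # gradient of a linear score is simply w

evaded = score(x_adv) <= 0
```

A proactive defense, in this framing, means anticipating such gradient-driven modifications at design time, e.g. by bounding how much any single feature can influence the score.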
Abstract:
Tedd, L.A. & Large, A. (2005). Digital libraries: principles and practice in a global environment. Munich: K.G. Saur.
Abstract:
Grattan, J.P., Gilbertson, D.D., Hunt, C.O. (2007). The local and global dimensions of metalliferous air pollution derived from a reconstruction of an 8 thousand year record of copper smelting and mining at a desert-mountain frontier in southern Jordan. Journal of Archaeological Science 34, 83-110
Abstract:
The Science of Network Service Composition has clearly emerged as one of the grand themes driving many of our research questions in the networking field today [NeXtworking 2003]. This driving force stems from the rise of sophisticated applications and new networking paradigms. By "service composition" we mean that the performance and correctness properties local to the various constituent components of a service can be readily composed into global (end-to-end) properties without re-analyzing any of the constituent components in isolation, or as part of the whole composite service. The set of laws that would govern such composition is what will constitute that new science of composition. The combined heterogeneity and dynamic open nature of network systems makes composition quite challenging, and thus programming network services has been largely inaccessible to the average user. We identify (and outline) a research agenda in which we aim to develop a specification language that is expressive enough to describe different components of a network service, and that will include type hierarchies inspired by type systems in general programming languages that enable the safe composition of software components. We envision this new science of composition to be built upon several theories (e.g., control theory, game theory, network calculus, percolation theory, economics, queuing theory). In essence, different theories may provide different languages by which certain properties of system components can be expressed and composed into larger systems. We then seek to lift these lower-level specifications to a higher level by abstracting away details that are irrelevant for safe composition at the higher level, thus making theories scalable and useful to the average user. In this paper we focus on services built upon an overlay management architecture, and we use control theory and QoS theory as example theories from which we lift up compositional specifications.
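The kind of composition law envisioned can be illustrated with a minimal, hypothetical sketch: serially connected components each export a local QoS property (here a delay bound), and a composition rule derives the end-to-end property without re-analyzing any component. All component names and numbers are invented for illustration:

```python
# Hypothetical per-component QoS specifications (names/values invented).
components = [
    {"name": "ingress-overlay", "delay_ms": 10},
    {"name": "transit-overlay", "delay_ms": 25},
    {"name": "egress-overlay", "delay_ms": 7},
]

# Composition law for serial composition: per-component delay bounds add,
# yielding a global (end-to-end) bound without re-analyzing components.
end_to_end_delay = sum(c["delay_ms"] for c in components)

# The composed property can be checked against a higher-level requirement.
meets_bound = end_to_end_delay <= 50
```

A richer specification language would attach such composition laws to typed component interfaces, so that only components whose properties compose safely can be wired together.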
Abstract:
Consideration of how people respond to the question What is this? has suggested new problem frontiers for pattern recognition and information fusion, as well as neural systems that embody the cognitive transformation of declarative information into relational knowledge. In contrast to traditional classification methods, which aim to find the single correct label for each exemplar (This is a car), the new approach discovers rules that embody coherent relationships among labels which would otherwise appear contradictory to a learning system (This is a car, that is a vehicle, over there is a sedan). This talk will describe how an individual who experiences exemplars in real time, with each exemplar trained on at most one category label, can autonomously discover a hierarchy of cognitive rules, thereby converting local information into global knowledge. Computational examples are based on the observation that sensors working at different times, locations, and spatial scales, and experts with different goals, languages, and situations, may produce apparently inconsistent image labels, which are reconciled by implicit underlying relationships that the network's learning process discovers. The ARTMAP information fusion system can, moreover, integrate multiple separate knowledge hierarchies, by fusing independent domains into a unified structure. In the process, the system discovers cross-domain rules, inferring multilevel relationships among groups of output classes, without any supervised labeling of these relationships. In order to self-organize its expert system, the ARTMAP information fusion network features distributed code representations which exploit the model's intrinsic capacity for one-to-many learning (This is a car and a vehicle and a sedan) as well as many-to-one learning (Each of those vehicles is a car). Fusion system software, testbed datasets, and articles are available from http://cns.bu.edu/techlab.
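The rule-discovery idea can be sketched, very loosely, without the ARTMAP network itself: if label A never occurs without label B across multi-source observations, infer the hierarchical rule "A is a kind of B". The toy observations below reuse the talk's car/vehicle/sedan example; everything else is an invented simplification:

```python
# Toy observations: sets of labels that different sources attached to
# the same kinds of exemplars (invented for illustration).
observations = [
    {"car", "vehicle"},
    {"car", "vehicle", "sedan"},
    {"truck", "vehicle"},
]

labels = set().union(*observations)

# Infer A => B ("A is a kind of B") when every observation containing A
# also contains B. This converts local, seemingly inconsistent labels
# into a global label hierarchy.
rules = set()
for a in labels:
    for b in labels:
        if a != b and all(b in obs for obs in observations if a in obs):
            rules.add((a, b))
```

From these three observations the sketch recovers, e.g., sedan => car and car => vehicle, while correctly not inferring vehicle => car.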
Abstract:
The causes of antibiotic resistance are complex and include human behaviour at many levels of society; the consequences affect everybody in the world. Similarities with climate change are evident. Many efforts have been made to describe the many different facets of antibiotic resistance and the interventions needed to meet the challenge. However, coordinated action is largely absent, especially at the political level, both nationally and internationally. Antibiotics paved the way for unprecedented medical and societal developments, and are today indispensable in all health systems. Achievements in modern medicine, such as major surgery, organ transplantation, treatment of preterm babies, and cancer chemotherapy, which we today take for granted, would not be possible without access to effective treatment for bacterial infections. Within just a few years, we might be faced with dire setbacks, medically, socially, and economically, unless real and unprecedented global coordinated actions are immediately taken. Here, we describe the global situation of antibiotic resistance, its major causes and consequences, and identify key areas in which action is urgently needed.
Abstract:
The problem of deriving parallel mesh partitioning algorithms for mapping unstructured meshes to parallel computers is discussed in this chapter. In itself this raises a paradox: we seek to find a high-quality partition of the mesh, but to compute it in parallel we require a partition of the mesh. In fact, we overcome this difficulty by deriving an optimisation strategy which can find a high-quality partition even if the quality of the initial partition is very poor, and then use a crude distribution scheme for the initial partition. The basis of this strategy is to use a multilevel approach combined with local refinement algorithms. Three such refinement algorithms are outlined and some example results presented which show that they can produce very high global quality partitions, very rapidly. The results are also compared with a similar multilevel serial partitioner and shown to be almost identical in quality. Finally we consider the impact of the initial partition on the results and demonstrate that the final partition quality is, modulo a certain amount of noise, independent of the initial partition.
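The local refinement step that such strategies build on can be sketched, under heavy simplifying assumptions (a serial toy, balanced pair swaps, no multilevel coarsening), as a Kernighan-Lin-style pass that swaps vertices across the cut whenever doing so reduces the number of cut edges. The graph and initial partition below are invented, not from the chapter:

```python
from itertools import product

# Toy graph: two triangles {0,1,2} and {3,4,5} joined by edge (2,3),
# with a deliberately poor (but balanced) initial partition.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
part = {0: 0, 1: 0, 2: 1, 3: 0, 4: 1, 5: 1}

def cut(p):
    # Number of edges whose endpoints lie in different parts.
    return sum(1 for u, v in edges if p[u] != p[v])

# Refinement: greedily swap one vertex from each part whenever the swap
# reduces the edge cut; swapping pairs preserves the balance.
improved = True
while improved:
    improved = False
    for u, v in product(part, part):
        if part[u] == 0 and part[v] == 1:
            trial = dict(part)
            trial[u], trial[v] = 1, 0  # swap u and v across the cut
            if cut(trial) < cut(part):
                part, improved = trial, True
```

Starting from a cut of 5, the pass converges to the optimal balanced cut of 1 (the bridge edge), illustrating how local refinement recovers a high-quality partition from a poor initial one.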
Abstract:
Sustainable development depends on maintaining ecosystem services which are concentrated in coastal marine and estuarine ecosystems. Analyses of the science needed to manage human uses of ecosystem services have concentrated on terrestrial ecosystems. Our focus is on the provision of multidisciplinary data needed to inform adaptive, ecosystem-based approaches (EBAs) for maintaining coastal ecosystem services based on comparative ecosystem analyses. Key indicators of pressures on coastal ecosystems, ecosystem states and the impacts of changes in states on services are identified for monitoring and analysis at a global coastal network of sentinel sites nested in the ocean-climate observing system. Biodiversity is targeted as the “master” indicator because of its importance to a broad spectrum of services. Ultimately, successful implementation of EBAs will depend on establishing integrated, holistic approaches to ocean governance that oversee the development of integrated, operational ocean observing systems based on the data and information requirements specified by a broad spectrum of stakeholders for sustainable development. Sustained engagement of such a spectrum of stakeholders on a global scale is not feasible. The global coastal network will need to be customized locally and regionally based on priorities established by stakeholders in their respective regions. The E.U. Marine Strategy Framework Directive and the U.S. Recommendations of the Interagency Ocean Policy Task Force are important examples of emerging regional scale approaches. The effectiveness of these policies will depend on the co-evolution of ocean policy and the observing system under the auspices of integrated ocean governance.
Abstract:
Evidence of global warming is now unequivocal, and studies suggest that it has started to influence natural systems of the planet, including the oceans. However, in the marine environment, it is well known that species and ecosystems can also be influenced by natural sources of large-scale hydro-climatological variability. The North Atlantic Oscillation (NAO) was negatively correlated with the mean abundance of the key subarctic species Calanus finmarchicus in the North Sea. This correlation was thought to have broken down in 1996; however, the timing has never been tested statistically. The present study revisits this unanticipated change and reveals that the correlation did not break down in 1996 as originally proposed but earlier, at the time of an abrupt ecosystem shift in the North Sea in the 1980s. Furthermore, the analyses demonstrate that the correlation between the NAO and C. finmarchicus abundance is modulated by the thermal regime of the North Sea, which in turn covaries positively with global temperature anomalies. This study thereby provides evidence that global climate change is likely to alter some empirical relationships found in the past between species abundance or the ecosystem state and large-scale natural sources of hydro-climatological variability. A theory is proposed to explain how this might happen. These unanticipated changes, also called 'surprises' in climatic research, are a direct consequence of the complexity of both climatic and biological systems. In this period of rapid climate change, it is therefore hazardous to integrate meteo-oceanic indices such as the NAO into models used in the management of living resources, as has sometimes been attempted in the past.
Abstract:
Ecosystem-based approaches (EBAs) to managing anthropogenic pressures on ecosystems, adapting to changes in ecosystem states (indicators of ecosystem health), and mitigating the impacts of state changes on ecosystem services are needed for sustainable development. EBAs are informed by integrated ecosystem assessments (IEAs) that must be compiled and updated frequently for EBAs to be effective. Frequently updated IEAs depend on the sustained provision of data and information on pressures, state changes, and impacts of state changes on services. Nowhere is this truer than in the coastal zone, where people and ecosystem services are concentrated and where anthropogenic pressures converge. This study identifies the essential indicator variables required for the sustained provision of frequently updated IEAs, and offers an approach to establishing a global network of coastal observations within the framework of the Global Ocean Observing System. The need for and challenges of capacity-building are highlighted, and examples are given of current programmes that could contribute to the implementation of a coastal ocean observing system of systems on a global scale. This illustrates the need for new approaches to ocean governance that can achieve coordinated integration of existing programmes and technologies as a first step towards this goal.