899 results for "free software environment for statistical computing and graphics R"


Relevance: 100.00%

Publisher:

Abstract:

Estuaries are areas which, owing to their structure, functioning, and location, are subject to significant nutrient inputs. One of the objectives of the RNO, the French network for coastal water quality monitoring, is to assess the levels and trends of nutrient concentrations in estuaries. A linear model was used to describe and explain the evolution of total dissolved nitrogen concentration in the three most important estuaries of the Channel-Atlantic seaboard (Seine, Loire and Gironde). As a first step, a reliable data set was selected. Patterns of total dissolved nitrogen evolution in the estuarine environment were then studied graphically, allowing a reasonable choice of covariables. Salinity played a major role in explaining nitrogen concentration variability in the estuaries, and dilution lines proved to be a useful tool for detecting outlying observations and for modelling the nitrogen/salinity relationship. Increasing trends were detected by the model, with a high magnitude in the Seine, intermediate in the Loire, and lower in the Gironde. The non-linear trends estimated in the Loire and Seine estuaries could be due to large interannual variations, as suggested by the graphical analysis. With the aim of adding value to the QUADRIGE database, a discussion of the statistical model and of the RNO hydrological data sampling strategy led to suggestions for a better exploitation of the nutrient data.
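
A minimal sketch of the modelling step (Python with statsmodels; not the study's code; column names, ranges and the synthetic data are assumptions): the dilution line enters as a salinity term, the sampling date provides the trend, and large residuals flag outlying observations.

# Illustrative sketch (not the authors' code): fitting a linear model of total
# dissolved nitrogen against salinity (dilution line) and time to detect a trend.
# Column names and the example data frame are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "year": rng.uniform(1990, 2000, n),      # sampling date (decimal years)
    "salinity": rng.uniform(0, 35, n),       # covers the estuarine mixing gradient
})
# Synthetic nitrogen: high at the river end, diluted at sea, plus a weak upward trend.
df["tdn"] = 120 - 3.0 * df["salinity"] + 1.5 * (df["year"] - 1990) + rng.normal(0, 5, n)

# Dilution line + linear time trend; the 'year' coefficient estimates the trend.
model = smf.ols("tdn ~ salinity + year", data=df).fit()
print(model.params)

# Observations far from the fitted dilution line can be flagged as outliers.
outliers = df[np.abs(model.resid) > 3 * model.resid.std()]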

Relevance: 100.00%

Publisher:

Abstract:

Happier employees are more productive, and organizations across industries therefore try to improve their employees' happiness with the objective of achieving higher profitability and company value. While this issue has drawn increasing attention in high tech and other industries, little is known about the happiness of project management professionals, and more research is needed to explore their current workplace happiness and the factors driving it. This thesis explores the workplace happiness (subjective well-being) of project management professionals through an exploratory statistical analysis of a survey of 225 professionals in the state of Maryland, conducted in October 2014. The thesis applies Structural Equation Modeling and multiple regression analysis to the dataset and shows no significant impact of gender, age, work experience, or other demographic traits on workplace happiness (well-being). Statistically significant factors for workplace happiness include a pleasant work environment, an open organization, a well-managed team, and a good organization to work for. With respect to the reliability of self-reporting, the study finds that the comprehensive appraisal tool designed by Happiness Works and the New Economics Foundation can give a more reliable happiness evaluation. Two key factors, career prospects and the freedom to be oneself, can help alleviate overconfidence in self-reported workplace happiness.
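
As a rough illustration of the regression component (Python with statsmodels; not the thesis code; the data frame, column names and effect sizes are synthetic stand-ins for the survey data):

# Illustrative sketch: multiple regression of a workplace happiness score on
# demographic traits and workplace factors, using synthetic data for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 225
survey = pd.DataFrame({
    "gender": rng.choice(["F", "M"], n),
    "age": rng.integers(25, 65, n),
    "pleasant_environment": rng.integers(1, 6, n),   # 1-5 Likert scores
    "open_organization": rng.integers(1, 6, n),
    "well_managed_team": rng.integers(1, 6, n),
})
survey["happiness"] = (
    0.5 * survey["pleasant_environment"]
    + 0.4 * survey["open_organization"]
    + 0.3 * survey["well_managed_team"]
    + rng.normal(0, 1, n)
)

model = smf.ols(
    "happiness ~ C(gender) + age + pleasant_environment"
    " + open_organization + well_managed_team",
    data=survey,
).fit()
print(model.summary())   # demographic terms expected to be non-significant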

Relevance: 100.00%

Publisher:

Abstract:

Well-designed marine protected area (MPA) networks can deliver a range of ecological, economic and social benefits, and so a great deal of research has focused on developing spatial conservation prioritization tools to help identify important areas. However, whilst these software tools are designed to identify MPA networks that both represent biodiversity and minimize impacts on stakeholders, they do not consider complex ecological processes. Thus, it is difficult to determine the impacts that proposed MPAs could have on marine ecosystem health, fisheries and fisheries sustainability. Using the eastern English Channel as a case study, this paper explores an approach to address these issues by identifying a series of MPA networks using the Marxan and Marxan with Zones conservation planning software and linking them with a spatially explicit ecosystem model developed in Ecopath with Ecosim. We then use these to investigate potential trade-offs associated with adopting different MPA management strategies. Limited-take MPAs, which restrict the use of some fishing gears, could have positive benefits for conservation and fisheries in the eastern English Channel, even though they generally receive far less attention in research on MPA network design. Our findings, however, also clearly indicate that no-take MPAs should form an integral component of proposed MPA networks in the eastern English Channel, as they not only result in substantial increases in ecosystem biomass, fisheries catches and the biomass of commercially valuable target species, but are fundamental to maintaining the sustainability of the fisheries. Synthesis and applications. Using the existing software tools Marxan with Zones and Ecopath with Ecosim in combination provides a powerful policy-screening approach. This could help inform marine spatial planning by identifying potential conflicts and by designing new regulations that better balance conservation objectives and stakeholder interests. In addition, it highlights that appropriate combinations of no-take and limited-take marine protected areas might be the most effective when making trade-offs between long-term ecological benefits and short-term political acceptability.
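
The policy-screening workflow, scoring candidate zoning plans with an ecosystem simulation and comparing the resulting trade-offs, can be caricatured as follows (Python; purely illustrative; the scenarios, the simulate() stand-in and all numbers are hypothetical and do not represent Marxan with Zones or Ecopath with Ecosim output):

# Purely illustrative sketch: candidate MPA network designs are scored with a
# stand-in ecosystem simulation and compared on conservation and fishery metrics.
from dataclasses import dataclass

@dataclass
class Outcome:
    ecosystem_biomass: float      # relative change vs. a no-MPA baseline
    fisheries_catch: float
    target_species_biomass: float

def simulate(no_take_fraction: float, limited_take_fraction: float) -> Outcome:
    """Hypothetical stand-in for a spatially explicit ecosystem simulation."""
    protection = no_take_fraction + 0.4 * limited_take_fraction
    return Outcome(
        ecosystem_biomass=1.0 + 0.6 * protection,
        fisheries_catch=1.0 + 0.3 * protection - 0.5 * no_take_fraction**2,
        target_species_biomass=1.0 + 0.9 * protection,
    )

scenarios = {
    "limited-take only": (0.00, 0.25),
    "mixed":             (0.10, 0.15),
    "no-take only":      (0.25, 0.00),
}
for name, (nt, lt) in scenarios.items():
    o = simulate(nt, lt)
    print(f"{name:18s} biomass={o.ecosystem_biomass:.2f} "
          f"catch={o.fisheries_catch:.2f} target={o.target_species_biomass:.2f}")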

Relevance: 100.00%

Publisher:

Abstract:

With the increasing complexity of today's software, the software development process is becoming highly time- and resource-consuming. The increasing number of software configurations, input parameters, usage scenarios, supporting platforms, external dependencies, and versions plays an important role in expanding the costs of maintaining and repairing unforeseeable software faults. To repair software faults, developers spend considerable time identifying the scenarios leading to those faults and root-causing the problems. While software debugging remains largely manual, this is not the case for software testing and verification. The goal of this research is to improve the software development process in general, and the software debugging process in particular, by devising techniques and methods for automated software debugging that leverage the advances in automatic test case generation and replay. In this research, novel algorithms are devised to discover faulty execution paths in programs by utilizing already existing software test cases, which can be either automatically or manually generated. The execution traces, or alternatively the sequence covers, of the failing test cases are extracted. Afterwards, commonalities between these test case sequence covers are extracted, processed, analyzed, and presented to the developers in the form of subsequences that may be causing the fault. The hypothesis is that code sequences shared between a number of test cases failing for the same reason resemble the faulty execution path, and hence the search space for the faulty execution path can be narrowed down by using a large number of test cases. To achieve this goal, an efficient algorithm is implemented for finding common subsequences among a set of code sequence covers. Optimization techniques are devised to generate shorter and more logical sequence covers, and to select subsequences with a high likelihood of containing the root cause among the set of all possible common subsequences. A hybrid static/dynamic analysis approach is designed to trace the common subsequences back from the end point to the root cause. A debugging tool is created to enable developers to use the approach and to integrate it with an existing Integrated Development Environment. The tool is also integrated with the environment's program editors so that developers can benefit from both the tool's suggestions and their source code counterparts. Finally, a comparison between the developed approach and state-of-the-art techniques shows that developers need to inspect only a small number of lines in order to find the root cause of a fault. Furthermore, experimental evaluation shows that the algorithm optimizations lead to better results in terms of both the algorithm running time and the output subsequence length.
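
One simple way to picture the core idea, intersecting the sequence covers of failing tests to narrow down the suspect path, is sketched below (Python; not the thesis's actual algorithm or optimizations; the trace contents are hypothetical):

# Illustrative sketch: narrowing down candidate faulty code regions by folding a
# standard longest-common-subsequence computation over the execution sequences of
# failing test cases. Each trace is a list of executed statement/block IDs.
from functools import reduce

def lcs(a, b):
    """Classic dynamic-programming longest common subsequence of two sequences."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            dp[i + 1][j + 1] = dp[i][j] + 1 if a[i] == b[j] else max(dp[i][j + 1], dp[i + 1][j])
    # Backtrack to recover one LCS.
    out, i, j = [], m, n
    while i and j:
        if a[i - 1] == b[j - 1]:
            out.append(a[i - 1]); i -= 1; j -= 1
        elif dp[i - 1][j] >= dp[i][j - 1]:
            i -= 1
        else:
            j -= 1
    return out[::-1]

# Hypothetical sequence covers of three failing test cases (statement IDs).
failing_traces = [
    ["s1", "s4", "s7", "s9", "s12"],
    ["s1", "s3", "s7", "s9", "s12"],
    ["s1", "s7", "s8", "s9", "s12"],
]
suspect = reduce(lcs, failing_traces)
print(suspect)   # ['s1', 's7', 's9', 's12'] -- candidate faulty execution path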

Relevance: 100.00%

Publisher:

Abstract:

The mobile water hyacinth, which was produced in growth zones, especially Murchison Bay, was mainly exported to three sheltered storage bays (Thruston, Hannington and Waiya). Between 1996 and May 1998, the mobile form of water hyacinth occupied about 800 ha in Thruston Bay, 750 ha in Hannington Bay and 140 ha in Waiya Bay. Biological control weevils and other factors, including localised nutrient depletion, weakened the weed confined to the bays, and it sank around October 1998. The settling of such huge quantities of organic matter to the bottom, its subsequent decomposition, and the resulting debris were likely to have environmental impacts on biotic communities (e.g. fish and invertebrates), physico-chemical conditions (water quality), and socio-economic activities (e.g. at fish landings, water abstraction points, and hydro-power generation points). Sunken water hyacinth debris could also affect nutrient levels in the water column and lead to a reduction in dissolved oxygen. The changes in nutrient dynamics and oxygen levels could in turn affect algal productivity, invertebrate composition and fish communities. Socio-economic impacts of the dead sunken weed were expected from debris deposited along the shoreline, especially at fish landings, water abstraction and hydropower generation points. Therefore, environmental impact assessment studies were carried out between 1998 and 2002 in selected representative zones of Lake Victoria to identify the effects of the sunken water hyacinth biomass.

Relevance: 100.00%

Publisher:

Abstract:

Spent hydroprocessing catalysts (HPCs) are solid wastes generated in refinery industries and typically contain various hazardous metals, such as Co, Ni, and Mo. These wastes cannot be discharged into the environment due to strict regulations and require proper treatment to remove the hazardous substances. Various options have been proposed and developed for spent catalyst treatment; however, hydrometallurgical processes are considered efficient, cost-effective and environmentally friendly methods of metal extraction, and have been widely employed for the uptake of different metals from aqueous leachates of secondary materials. Although there are a large number of studies on hazardous metal extraction from aqueous solutions of various spent catalysts, little information is available on Co, Ni, and Mo removal from spent NiMo hydroprocessing catalysts. In the current study, a solvent extraction process was applied to the spent HPC to specifically remove Co, Ni, and Mo. The spent HPC was dissolved in an acid solution and the metals were then extracted using three different extractants, two of which were amine-based and one of which was a quaternary ammonium salt. The main aim of this study was to develop a hydrometallurgical method to remove, and ultimately be able to recover, Co, Ni, and Mo from the spent HPCs produced at the petrochemical plant in Come By Chance, Newfoundland and Labrador. The specific objectives of the study were: (1) characterization of the spent catalyst and the acidic leachate; (2) identification of the most efficient leaching agent to dissolve the metals from the spent catalyst; (3) development of a solvent extraction procedure using the amine-based extractants Alamine308 and Alamine336 and the quaternary ammonium salt Aliquat336 in toluene to remove Co, Ni, and Mo from the spent catalyst; (4) selection of the best reagent for Co, Ni, and Mo extraction based on the required contact time, the required extractant concentration, and the organic:aqueous ratio; and (5) evaluation of the extraction conditions and optimization of the metal extraction process using the Design Expert software. For the present study, a Central Composite Design (CCD) method was applied as the main method to design the experiments, evaluate the effect of each parameter, provide a statistical model, and optimize the extraction process. Three parameters were considered the most significant factors affecting process efficiency: (i) extractant concentration, (ii) the organic:aqueous ratio, and (iii) contact time. Metal extraction efficiencies were calculated based on ICP analysis of the pre- and post-leachates, and the process optimization was conducted with the aid of the Design Expert software. The results showed that Alamine308 can be considered the most effective and suitable extractant for the spent HPC examined in this study, being capable of removing all three metals in the maximum amounts. Aliquat336 was found to be less effective, especially for Ni extraction; however, it was able to separate all of these metals within the first 10 min, unlike Alamine336, which required more than 35 min to do so. Based on the results of this study, a cost-effective and environmentally friendly solvent extraction process was achieved to remove Co, Ni, and Mo from the spent HPCs in a short amount of time and with a low extractant concentration. This method can be tested and implemented for other hazardous metals from other secondary materials as well. Further investigation may be required; however, the results of this study can serve as a guide for future research on similar metal extraction processes.
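
As a rough illustration of the experimental-design step (Python; not the study's actual design or data; the factor ranges and response values are invented), a face-centred central composite design for the three factors can be built and a quadratic response-surface model fitted as follows:

# Illustrative sketch: face-centred central composite design for extractant
# concentration, organic:aqueous ratio, and contact time, with a quadratic
# response-surface fit. All values are hypothetical.
import itertools
import numpy as np

# Coded levels: 2^3 factorial corners, 6 face-centred axial points, 1 centre point.
corners = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
axial = np.array([[s if i == j else 0 for j in range(3)]
                  for i in range(3) for s in (-1, 1)], dtype=float)
centre = np.zeros((1, 3))
design = np.vstack([corners, axial, centre])   # 15 coded runs

# Map coded levels to hypothetical real ranges.
lo = np.array([0.05, 0.5, 5.0])    # extractant (M), O:A ratio, time (min)
hi = np.array([0.50, 2.0, 40.0])
runs = lo + (design + 1) / 2 * (hi - lo)

# Hypothetical measured extraction efficiencies (%), one per run.
y = np.array([62, 70, 75, 83, 68, 76, 80, 90, 66, 85, 72, 81, 60, 88, 78], dtype=float)

# Quadratic model in coded units: intercept, linear, squared, and interaction terms.
x1, x2, x3 = design.T
X = np.column_stack([np.ones(len(design)), x1, x2, x3,
                     x1**2, x2**2, x3**2, x1*x2, x1*x3, x2*x3])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(dict(zip(["b0","b1","b2","b3","b11","b22","b33","b12","b13","b23"], coef.round(2))))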

Relevance: 100.00%

Publisher:

Abstract:

Humans have a high ability to extract information from visual data acquired by sight. Through a learning process, which starts at birth and continues throughout life, image interpretation becomes almost instinctive. At a glance, one can easily describe a scene with reasonable precision, naming its main components. Usually, this is done by extracting low-level features such as edges, shapes and textures, and associating them with high-level meanings; in this way, a semantic description of the scene is produced. An example of this is the human capacity to recognize and describe other people's physical and behavioral characteristics, or biometrics. Soft biometrics also represent inherent characteristics of the human body and behavior, but do not allow unique identification of a person. The field of computer vision aims to develop methods capable of performing visual interpretation with performance similar to that of humans. This thesis proposes computer vision methods that allow high-level information to be extracted from images in the form of soft biometrics. The problem is approached with two kinds of methods, unsupervised and supervised learning. The first seeks to group images via automatically learned feature extraction, combining convolution techniques, evolutionary computing and clustering; the images employed in this approach contain faces and people. The second approach employs convolutional neural networks, which operate on raw images and learn both the feature extraction and the classification processes; here, images are classified according to gender and clothing, divided into upper and lower parts of the human body. When tested on different image datasets, the first approach obtained an accuracy of approximately 80% for faces versus non-faces and 70% for persons versus non-persons. The second, tested on images and videos, obtained an accuracy of about 70% for gender, 80% for upper-body clothing and 90% for lower-body clothing. The results of these case studies show that the proposed methods are promising, enabling automatic high-level annotation of images. This opens up possibilities for applications in diverse areas, such as content-based image and video search and automatic video surveillance, reducing the human effort required for manual annotation and monitoring.
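
To make the supervised approach concrete, a minimal sketch of a convolutional classifier for one soft-biometric attribute is shown below (Python with PyTorch; not the thesis architecture; the input size, layer widths and random batch are hypothetical):

# Illustrative sketch: a small convolutional network for a binary soft-biometric
# attribute (e.g. gender) operating on raw image crops.
import torch
import torch.nn as nn

class SoftBiometricCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = SoftBiometricCNN()
images = torch.randn(8, 3, 64, 64)           # a hypothetical batch of person crops
labels = torch.randint(0, 2, (8,))
loss = nn.CrossEntropyLoss()(model(images), labels)
loss.backward()                               # one illustrative training step (optimizer omitted)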

Relevance: 100.00%

Publisher:

Abstract:

As climate change continues to impact socio-ecological systems, tools that assist conservation managers to understand vulnerability and target adaptations are essential. Quantitative assessments of vulnerability are rare because available frameworks are complex and lack guidance for dealing with data limitations and integrating across scales and disciplines. This paper describes a semi-quantitative method for assessing vulnerability to climate change that integrates socio-ecological factors to address management objectives and support decision-making. The method applies a framework first adopted by the Intergovernmental Panel on Climate Change and uses a structured 10-step process. The scores for each framework element are normalized and multiplied to produce a vulnerability score, and the assessed components are then ranked from high to low vulnerability. Sensitivity analyses determine which indicators most influence the analysis and the resultant decision-making process, so that data quality for these indicators can be reviewed to increase robustness. Prioritisation of components for conservation considers economic, social and cultural values alongside the vulnerability rankings, targeting actions that reduce vulnerability to climate change by decreasing exposure or sensitivity and/or increasing adaptive capacity. This framework provides practical decision support and has been applied to marine ecosystems and fisheries, with two case applications provided as examples: (1) food security in Pacific Island nations under climate-driven fish declines, and (2) fisheries in the Gulf of Carpentaria, northern Australia. The step-wise process outlined here is broadly applicable and can be undertaken with minimal resources using existing data, and therefore has great potential to inform adaptive natural resource management in diverse locations.
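
A minimal sketch of the scoring step (Python; one common formulation, not necessarily the paper's exact one; all component names and scores are hypothetical): indicator scores for exposure (E), sensitivity (S) and adaptive capacity (AC) are normalized and combined, and components are ranked from high to low vulnerability.

# Illustrative sketch: normalize element scores to [0, 1], combine them as
# V = E * S * (1 - AC), and rank the assessed components.
components = {
    # component: (exposure, sensitivity, adaptive capacity), raw scores on a 1-10 scale
    "reef fishery A":    (8, 7, 3),
    "coastal fishery B": (6, 5, 7),
    "pelagic fishery C": (4, 8, 5),
}

def norm(x, lo=1, hi=10):
    return (x - lo) / (hi - lo)

scores = {
    name: norm(e) * norm(s) * (1 - norm(ac))
    for name, (e, s, ac) in components.items()
}
for name, v in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name:18s} vulnerability = {v:.2f}")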

Relevance: 100.00%

Publisher:

Abstract:

Embedded software systems in vehicles are of rapidly increasing commercial importance for the automotive industry. Current systems employ a static run-time environment, owing to the difficulty and cost involved in developing dynamic systems in a high-integrity embedded control context. A dynamic system, in the sense of a dynamic system configuration, would greatly increase the flexibility of the offered functionality and enable customised software configuration for individual vehicles, adding customer value through plug-and-play capability and increased quality due to its inherent ability to adjust to changes in hardware and software. We envisage an automotive system containing a variety of components, from a multitude of organizations, not necessarily known at development time, which dynamically adapts its configuration to suit the run-time system constraints. This paper presents our vision for future automotive control systems, to be investigated in an EU research project referred to as DySCAS (Dynamically Self-Configuring Automotive Systems). We propose a self-configuring vehicular control system architecture with capabilities that include automatic discovery and inclusion of new devices, self-optimisation to make best use of the available processing, storage and communication resources, self-diagnostics and ultimately self-healing. Such an architecture has benefits extending to reduced development and maintenance costs, improved passenger safety and comfort, and flexible owner customisation. Specifically, this paper addresses the following issues: the state of the art of embedded software systems in vehicles, emphasising the current limitations arising from fixed run-time configurations; and the benefits and challenges of dynamic configuration, giving rise to opportunities for self-healing, self-optimisation, and the automatic inclusion of users' Consumer Electronic (CE) devices. Our proposal for a dynamically reconfigurable automotive software system platform is outlined, and a typical use case is presented to exemplify the benefits of the envisioned dynamic capabilities.

Relevance: 100.00%

Publisher:

Abstract:

The increasing dependency of everyday life on mobile devices also increases the number and complexity of the computing tasks these devices must support. However, the inherent requirement of mobility restricts them from being resource-rich, in terms of both energy (battery capacity) and other computing resources such as processing capacity and memory. This thesis investigates the cyber foraging technique of offloading computing tasks. Various experiments on Android mobile devices were carried out to evaluate the benefits of offloading in terms of sustainability, prolonged battery life and augmented performance of mobile devices. The thesis considers two cyber foraging scenarios, namely opportunistic offloading and competitive offloading. The results show that the offloading scenarios are important for both green computing and resource augmentation of mobile devices: a significant gain in battery life and performance was obtained, and cyber foraging proved efficient in minimizing the energy consumed per computing task. The work is based on the Scavenger cyber foraging system. In addition, the work can be used as a basis for studying cyber foraging and similar approaches, such as mobile cloud/edge computing for Internet of Things devices, and for improving the user experience of applications by minimizing latency through the use of nearby surrogates.
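
A minimal sketch of a common offloading decision rule (Python; not Scavenger's actual policy; all parameter names and numbers are hypothetical): offload when the estimated remote execution plus data-transfer cost beats local execution in both time and energy.

# Illustrative sketch: offload a task to a nearby surrogate when the estimated
# remote execution plus transfer cost beats local execution in time and energy.
from dataclasses import dataclass

@dataclass
class Task:
    cycles: float        # CPU cycles required
    data_bytes: float    # input + output data to transfer

def should_offload(task: Task,
                   local_speed=1e9,         # device cycles/s
                   remote_speed=8e9,        # surrogate cycles/s
                   bandwidth=2e6,           # bytes/s to the surrogate
                   tx_power=0.8,            # W while transmitting
                   cpu_power=2.0) -> bool:  # W while computing locally
    t_local = task.cycles / local_speed
    t_remote = task.cycles / remote_speed + task.data_bytes / bandwidth
    e_local = cpu_power * t_local
    e_remote = tx_power * (task.data_bytes / bandwidth)   # device mostly idles while waiting
    return t_remote < t_local and e_remote < e_local

print(should_offload(Task(cycles=5e9, data_bytes=1e6)))   # compute-heavy: likely True
print(should_offload(Task(cycles=1e8, data_bytes=5e7)))   # data-heavy: likely False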

Relevance: 100.00%

Publisher:

Abstract:

This article addresses the main technical aspects that facilitate the use and growth of cloud computing, which go hand in hand with the emergence of more and better services on the Internet and the technological development of broadband. Finally, it examines the impact of cloud computing technologies on the automation of information units.

Relevance: 100.00%

Publisher:

Abstract:

This talk, which is based on our newest findings and experiences from research and industrial projects, addresses one of the most relevant challenges for the decade to come: how to integrate the Internet of Things with software, people, and processes, considering modern Cloud Computing and elasticity principles. Elasticity is seen as one of the main characteristics of Cloud Computing today. Is elasticity simply scalability on steroids? This talk addresses the main principles of elasticity, presents a fresh look at the problem, and examines how to integrate people, software services, and things into one composite system that can be modeled, programmed, and deployed on a large scale in an elastic way. This novel paradigm has major consequences for how we view, build, design, and deploy ultra-large-scale distributed systems.

Relevance: 100.00%

Publisher:

Abstract:

Since the emergence of software engineering in the late 1960s as a response to the software crisis, researchers throughout the world have been trying to give theoretical support to this discipline. Several points of view have to be reviewed in order to complete this task. In the mid-1970s Frederick Brooks Jr. coined the term "silver bullet", suggesting a solution to several problems related to software engineering, and hence we adopted this metaphor as a symbol for this book. Methods, modeling, and teaching are the topics reviewed in this book. Some work related to these topics is presented by software engineering researchers, led by Ivar Jacobson, one of the most remarkable researchers in this area. We hope our work will contribute to providing the theoretical support that software engineering needs.