798 results for Management techniques
Abstract:
Division of Fisheries, Illinois Department of Natural Resources Grant/Contract No: F-123 R-14
Abstract:
Division of Fisheries, Illinois Department of Natural Resources Grant/Contract No: F-123 R-13
Abstract:
Excess nutrient loads carried by streams and rivers are a great concern for environmental resource managers. In agricultural regions, excess loads are transported downstream to receiving water bodies, where they can cause algal blooms and, in turn, numerous ecological problems. To better understand nutrient load transport and to develop appropriate water management plans, it is important to have accurate estimates of annual nutrient loads. This study used a Monte Carlo sub-sampling method and error-corrected statistical models to estimate annual nitrate-N loads from two watersheds in central Illinois. The performance of three load estimation methods (the seven-parameter log-linear model, the ratio estimator, and the flow-weighted averaging estimator) applied at one-, two-, four-, six-, and eight-week sampling frequencies was compared. Five error correction techniques (the existing composite method and four new techniques developed in this study) were applied to each combination of sampling frequency and load estimation method. On average, the most accurate error reduction technique (proportional rectangular) produced load estimates 15% and 30% more accurate than those of the most accurate uncorrected load estimation method (the ratio estimator) for the two watersheds. With error correction methods, it is possible to design more cost-effective monitoring plans that achieve the same load estimation accuracy with fewer observations. Finally, the optimum combinations of monitoring threshold and sampling frequency that minimize the number of samples required to achieve specified levels of accuracy in load estimation were determined. For one- to three-week sampling frequencies, combined threshold/fixed-interval monitoring approaches produced the best outcomes, while fixed-interval-only approaches produced the most accurate results at four- to eight-week sampling frequencies.
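For concreteness, the sketch below shows the ratio estimator (the most accurate uncorrected method in this comparison) together with the fixed-interval sub-sampling step used in such Monte Carlo experiments; function names, units, and the simple bias-uncorrected form are illustrative assumptions, not the study's exact implementation.

```python
import numpy as np

def ratio_estimator_annual_load(sample_conc, sample_flow, daily_flow):
    """Annual nitrate-N load (kg) from a subsampled record.

    sample_conc : nitrate-N concentrations (mg/L) on sampled days
    sample_flow : flows (m^3/s) on the same sampled days
    daily_flow  : continuous daily flow record (m^3/s) for the full year
    """
    sample_conc = np.asarray(sample_conc, float)
    sample_flow = np.asarray(sample_flow, float)

    # Flow-weighted mean concentration of the subsample (mg/L = g/m^3).
    fw_conc = (sample_conc * sample_flow).sum() / sample_flow.sum()

    # Total annual discharge (m^3): daily flows times seconds per day.
    total_discharge = np.asarray(daily_flow, float).sum() * 86_400

    return fw_conc * total_discharge / 1e3  # grams -> kilograms

def fixed_interval_subsample(daily_conc, daily_flow, interval_days=7, offset=0):
    """One synthetic monitoring record; varying `offset` over many Monte
    Carlo draws yields the error distribution of the estimator."""
    idx = np.arange(offset, len(daily_flow), interval_days)
    return np.asarray(daily_conc)[idx], np.asarray(daily_flow)[idx]
```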
Abstract:
This thesis examines the importance of effective stakeholder engagement that complies with the doctrines of social justice in decision-making for non-renewable resource management. It uses hydraulic fracturing in the Green Point Shale Formation in Western Newfoundland as a case study. The thesis takes as its theoretical background John Rawls' and David Miller's theories of social justice, and identifies the social justice principles relevant to stakeholder engagement. It compares the method of stakeholder engagement employed by the Newfoundland and Labrador Hydraulic Fracturing Review Panel (NLHFRP) with the stakeholder engagement techniques recommended by the Structured Decision Making (SDM) model, as applied to a simulated case study involving hydraulic fracturing in the Green Point Shale Formation. Using the identified social justice principles, the thesis then develops a framework to measure how closely both stakeholder engagement techniques comply with social justice principles. The main finding is that the engagement techniques prescribed by the SDM model comply more closely with the doctrines of social justice than those applied by the NLHFRP. The thesis concludes by recommending that the SDM model be more widely used in non-renewable resource management decision-making, to ensure that all stakeholders' concerns are effectively heard, understood, and transparently incorporated into non-renewable resource policies, making those policies consistent with local priorities and goals and with social justice norms and institutions.
Abstract:
Employees are the human capital which, to a great extent, contributes to the success and development of high-performance and sustainable organizations. In a work environment, there is a need for a tool for tracking and following up on each employee's professional progress, while staying aligned with the organization's strategic and operational goals and objectives. The research work within this Thesis aims to help improve employees' self-awareness and self-regulation; two predominant research areas are studied and analyzed: Visual Analytics and Gamification. Visual Analytics enables the specification of personalized dashboard interfaces with alerts and indicators that keep employees aware of their skills and continuously show how to improve their expertise, simultaneously promoting behavioral change and the adoption of good practices. The study of Gamification techniques with Talent Management features enabled the design of new processes to engage, motivate, and retain highly productive employees, and to foster a competitive working environment where employees are encouraged to take part in new and rewarding activities and where knowledge and experience are recognized as relevant assets. Design Science Research was selected as the research methodology; the creation of new knowledge is therefore based on an iterative cycle addressing design, analysis, reflection, and abstraction. Through collaboration in an international project (Active@Work), funded by the Active and Assisted Living Programme, the results followed a design thinking approach to the specification of the structure and behavior of the Skills Development Module, namely the identification of requirements and the design of an innovative info-structure of metadata to support the user experience. A set of mockups was designed based on the user role and main concerns. This approach enabled the conceptualization of a solution to proactively assist the management and assessment of skills in a personalized and dynamic way. The outcomes of this Thesis aim to demonstrate the articulation between emerging research areas such as Visual Analytics and Gamification, and are expected to represent conceptual gains in these two research fields.
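As a purely hypothetical illustration of the kind of metadata such a module might track, the sketch below couples a dashboard alert with a simple gamification reward; every field, name, and threshold is an assumption, not the actual schema of the Skills Development Module.

```python
from dataclasses import dataclass, field

@dataclass
class SkillIndicator:
    """One skill tracked on a personalized dashboard (illustrative only)."""
    name: str
    level: float            # current proficiency, on a 0..1 scale
    target: float           # level expected by the organization's goals
    points: int = 0         # gamification reward accumulated for this skill
    history: list = field(default_factory=list)

    def record_progress(self, new_level: float, reward: int = 10) -> None:
        # Reward improvement, to motivate self-regulation.
        if new_level > self.level:
            self.points += reward
        self.history.append(new_level)
        self.level = new_level

    @property
    def needs_alert(self) -> bool:
        # Dashboard alert fires while the skill is below its target.
        return self.level < self.target
```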
Abstract:
Early water resources modeling efforts were aimed mostly at representing hydrologic processes, but the need for interdisciplinary studies has led to increasing complexity and to the integration of environmental, social, and economic functions. The gradual shift from merely employing engineering-based simulation models to applying more holistic frameworks is an indicator of promising changes in the traditional paradigm for the application of water resources models, supporting more sustainable management decisions. This dissertation contributes to the application of a quantitative-qualitative framework for sustainable water resources management using system dynamics simulation, as well as environmental systems analysis techniques, to provide insights for water quality management in the Great Lakes basin. The traditional linear thinking paradigm lacks the mental and organizational framework for sustainable development trajectories and may lead to quick-fix solutions that fail to address the key drivers of water resources problems. To facilitate holistic analysis of water resources systems, systems thinking seeks to understand interactions among the subsystems. System dynamics provides a suitable framework for operationalizing systems thinking and applying it to water resources problems, offering useful qualitative tools such as causal loop diagrams (CLD), stock-and-flow diagrams (SFD), and system archetypes. The approach provides a high-level quantitative-qualitative modeling framework for "big-picture" understanding of water resources systems, stakeholder participation, policy analysis, and strategic decision making. While quantitative modeling using extensive computer simulation and optimization is still very important and needed for policy screening, qualitative system dynamics models can improve understanding of general trends and of the root causes of problems, and thus promote sustainable water resources decision making. Within the system dynamics framework, a growth and underinvestment (G&U) system archetype governing Lake Allegan's eutrophication problem was hypothesized to explain the system's problematic behavior and to identify policy leverage points for mitigation. A system dynamics simulation model was developed to characterize the lake's recovery from its hypereutrophic state and to assess a number of proposed total maximum daily load (TMDL) reduction policies, including total phosphorus (TP) load reductions from point sources (PS) and non-point sources (NPS). It was shown that, for a TMDL plan to be effective, it should be considered a component of a continuous sustainability process that accounts for the dynamic feedback relationships between socio-economic growth, land use change, and environmental conditions. Furthermore, a high-level simulation-optimization framework was developed to guide watershed-scale implementation of best management practices (BMPs) in the Kalamazoo watershed. Agricultural BMPs should be given priority in the watershed in order to facilitate cost-efficient attainment of Lake Allegan's TP concentration target. However, without adequate support policies, agricultural BMP implementation may adversely affect agricultural producers. Results from a case study of the Maumee River basin show that coordinated BMP implementation across upstream and downstream watersheds can significantly improve the cost efficiency of TP load abatement.
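To make the stock-and-flow idea concrete, here is a minimal, purely illustrative simulation of a lake phosphorus balance under a load-reduction policy; the parameter values and the first-order loss term are assumptions, not the dissertation's calibrated Lake Allegan model.

```python
def simulate_lake_tp(years=30, dt=0.1,
                     ps_load=20.0,     # point-source P load (t/yr), assumed
                     nps_load=60.0,    # non-point-source P load (t/yr), assumed
                     ps_cut=0.5,       # fractional PS reduction under the policy
                     nps_cut=0.3,      # fractional NPS reduction under the policy
                     loss_rate=0.8,    # first-order loss: outflow + settling (1/yr)
                     p_stock=100.0):   # initial in-lake phosphorus stock (t)
    """Euler integration of the stock equation dP/dt = inflow - loss_rate * P."""
    inflow = ps_load * (1 - ps_cut) + nps_load * (1 - nps_cut)
    history = []
    for step in range(int(years / dt)):
        p_stock += (inflow - loss_rate * p_stock) * dt
        history.append((step * dt, p_stock))
    return history

# The stock settles at inflow / loss_rate; rerunning with different
# ps_cut / nps_cut values mimics screening alternative TMDL policies.
```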
Abstract:
This thesis presents a study of Grid data access patterns in distributed analysis in the CMS experiment at the LHC accelerator. The study ranges from a deep analysis of the historical patterns of access to the most relevant data types in CMS, to the exploitation of a supervised Machine Learning classification system to set up machinery able to predict future data access patterns - i.e. the so-called “popularity” of CMS datasets on the Grid - with a focus on specific data types. All CMS workflows run on the Worldwide LHC Computing Grid (WLCG) computing centers (Tiers), and the distributed analysis system in particular sustains hundreds of users and the applications they submit every day. These applications (or “jobs”) access different data types hosted on disk storage systems at a large set of WLCG Tiers. A detailed study of how these data are accessed, in terms of data types, hosting Tiers, and time periods, yields valuable insight into storage occupancy over time and into different access patterns, and ultimately suggests actions based on this information (e.g. targeted disk clean-up and/or data replication). In this sense, applying Machine Learning techniques makes it possible to learn from past data and to gain predictive power over future CMS data access patterns. Chapter 1 provides an introduction to High Energy Physics at the LHC. Chapter 2 describes the CMS Computing Model, with special focus on the data management sector, also discussing the concept of dataset popularity. Chapter 3 describes the study of CMS data access patterns at different levels of depth. Chapter 4 offers a brief introduction to basic machine learning concepts, introduces their application in CMS, and discusses the results obtained by using this approach in the context of this thesis.
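The sketch below illustrates the kind of supervised classification described here, using a random forest on synthetic data; the feature set and the popularity threshold are assumptions for demonstration, not the thesis' actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Toy features per dataset: [accesses last week, unique users,
# size (TB), weeks since creation] -- illustrative choices only.
X = rng.random((1000, 4)) * [500, 50, 10, 100]

# Synthetic label: "popular" if near-future accesses exceed a threshold.
y = (X[:, 0] + 20 * rng.standard_normal(1000) > 250).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```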
Abstract:
Urban inequality has emerged as one of the dominant themes of modern life and globalization. More than three million people experienced homelessness in the United States last year; in Miami-Dade, more than 15,000 individuals were homeless. Surviving extreme poverty, and exiting or avoiding homelessness, involves negotiating a complex mix of public and private assistance. However, a range of factors influences what types of help are available and how they can be accessed. Frequently, larger social structures determine which resources are available, leaving many choices entirely out of the individual's control. For single men, who are ineligible for many benefits, homelessness can be difficult to avoid and even harder to exit. My study seeks to better understand how adult, minority men living in extreme poverty in Miami-Dade negotiate their daily survival. The specific research questions are: Do black and Hispanic men who are homeless or at risk of homelessness have different personal characteristics and different experiences in avoiding or exiting homelessness? How does Miami's response to extreme poverty and homelessness, including the availability of public benefits and of public and private service organizations, either maximize or constrain the choices available to this population? And what is the actual experience of single, adult men who are homeless or at risk of homelessness in negotiating their daily survival? A mixed methods approach combines quantitative survey data from 7,605 homeless men with qualitative data from 54 semi-structured interviews incorporating the visual ethnography techniques of Photo Elicitation Interviewing. Results show the differences experienced by black and Hispanic men who are poor and homeless in Miami. Findings also highlight how the community's official and unofficial responses to homelessness intersect with the actual experiences of the persons targeted by the policies and programs, challenging preconceived notions about the lives of persons living in extreme poverty. The study adds to the existing body of literature by focusing on the urban Miami context and emphasizing disparities among racial and ethnic groups. Findings are intended to provide an empirically grounded thesis that humanizes the subjects and illuminates their personal experiences, helping to inform public policy around the needs of extremely poor populations.
Abstract:
The current study examines the effects of an online workshop on classroom behavior management on teacher self-efficacy, attitudes, motivation, knowledge, and practices. In addition, information about teachers' use of the Internet, their opinions about professional development, and their experiences with classroom management was collected. Participants included 57 first- through fifth-grade special and regular education teachers. Eligible teachers were those who taught an academic subject and had at least one child in the classroom whom they considered disruptive. Teachers were randomized to either a training or a waitlist group. Classroom observations of teacher practices and questionnaires were utilized. Teachers in the training group participated in two assessment points, baseline and post-workshop, and received access to the online course immediately following the baseline assessment. Teachers in the waitlist group participated in three assessment points, baseline, post-workshop, and follow-up, and received access to the online course immediately following the post-workshop assessment. Findings show that all teachers had access to the Internet at home and at school and used it on a daily basis. The majority of teachers indicated having some past training on all the techniques presented in the online workshop. All teachers expressed satisfaction with the workshop, indicating that it should be offered again. Post-workshop, findings showed significant group differences in knowledge, with a large effect: the training group scored higher than the waitlist group on a quiz. Second, group differences in self-efficacy, knowledge, and attitudes, with teachers' past training as a moderator, were examined. Past training was not found to be a significant moderator of self-efficacy, knowledge, or attitudes; however, the main effect of training group was significant for attitudes. In addition, teacher attitudes, but not knowledge or self-efficacy, significantly predicted motivation to implement. Next, the moderating effect of barriers on motivation and classroom management skill implementation was examined; barriers were not found to be a significant moderator. Lastly, the training group was observed to be significantly more effective at giving commands than the waitlist group. The current study demonstrates the potential of a low-intensity online workshop on classroom management to enhance the accessibility of teacher professional development.
Abstract:
The rapid growth of virtualized data centers and cloud hosting services is making the management of physical resources such as CPU, memory, and I/O bandwidth in data center servers increasingly important. Server management now involves dealing with multiple dissimilar applications with varying Service Level Agreements (SLAs) and multiple resource dimensions. The multiplicity and diversity of resources and applications are rendering administrative tasks more complex and challenging. This thesis aimed to develop a framework and techniques that would help substantially reduce data center management complexity. We specifically addressed two crucial data center operations. First, we precisely estimated the capacity requirements of client virtual machines (VMs) when renting server space in a cloud environment. Second, we proposed a systematic process to efficiently allocate physical resources to hosted VMs in a data center. To realize these dual objectives, accurately capturing the effects of resource allocations on application performance is vital. The benefits of accurate application performance modeling are manifold. Cloud users can size their VMs appropriately and pay only for the resources they need; service providers can also offer a new charging model based on the VMs' performance instead of their configured sizes. As a result, clients pay exactly for the performance they actually experience, while administrators can maximize their total revenue by combining application performance models with SLAs. This thesis made the following contributions. First, we identified the resource control parameters crucial for distributing physical resources and characterizing contention for virtualized applications in a shared hosting environment. Second, we explored several modeling techniques and confirmed the suitability of two machine learning tools, Artificial Neural Networks and Support Vector Machines, for accurately modeling the performance of virtualized applications; we also suggested and evaluated modeling optimizations necessary to improve prediction accuracy with these tools. Third, we presented an approach to optimal VM sizing that employs the performance models we created. Finally, we proposed a revenue-driven resource allocation algorithm that maximizes the SLA-generated revenue for a data center.
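As a hedged sketch of this modeling step, the code below fits the two learners named above to a synthetic response surface mapping resource allocations to response time; the feature names and the surface itself are assumptions, not the thesis' measured workloads.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)

# Features per configuration: [CPU cap (%), memory (GB), I/O bandwidth (MB/s)].
X = rng.uniform([10, 1, 10], [100, 16, 200], size=(500, 3))
# Synthetic response time: falls as each resource grows, plus noise.
y = 100 / X[:, 0] + 20 / X[:, 1] + 500 / X[:, 2] + rng.normal(0, 0.1, 500)

svm_model = make_pipeline(StandardScaler(), SVR(C=10.0)).fit(X, y)
ann_model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0),
).fit(X, y)

candidate = np.array([[50, 4, 100]])  # a candidate VM sizing to evaluate
print(svm_model.predict(candidate), ann_model.predict(candidate))
```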
Abstract:
With the exponential growth in the usage of web-based map services, web GIS applications have become more and more popular. Spatial data indexing, search, analysis, visualization, and the resource management of such services are becoming increasingly important for delivering the Quality of Service (QoS) users expect. First, spatial indexing is typically time-consuming and is not available to end users. To address this, we introduce TerraFly sksOpen, an open-source Online Indexing and Querying System for Big Geospatial Data. Integrated with the TerraFly Geospatial database [1-9], sksOpen is an efficient indexing and query engine for processing Top-k Spatial Boolean Queries. Further, we provide ergonomic visualization of query results on interactive maps to facilitate the user's data analysis. Second, due to the highly complex and dynamic nature of GIS systems, it is quite challenging for end users to quickly understand and analyze spatial data, and to efficiently share their own data and analysis results with others. Built on the TerraFly Geospatial database, TerraFly GeoCloud is an extra layer running on top of the TerraFly map which efficiently supports many different visualization functions and spatial data analysis models. Furthermore, users can create unique URLs to visualize and share analysis results. TerraFly GeoCloud also provides the MapQL technology for customizing map visualization using SQL-like statements [10]. Third, map systems often serve dynamic web workloads and involve multiple CPU- and I/O-intensive tiers, which makes it challenging to meet the response time targets of map requests while using resources efficiently. Virtualization facilitates the deployment of web map services and improves their resource utilization through encapsulation and consolidation. Autonomic resource management allows resources to be automatically provisioned to a map service and its internal tiers on demand. v-TerraFly is a set of techniques to predict the demand of map workloads online and optimize resource allocations, considering both response time and data freshness as the QoS target. The proposed v-TerraFly system is prototyped on TerraFly, a production web map service, and evaluated using real TerraFly workloads. The results show that v-TerraFly can predict workload demands accurately (18.91% more accurately) and allocate resources efficiently to meet the QoS target (improving QoS by 26.19% and saving 20.83% in resource usage) compared to traditional peak-load-based resource allocation.
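For readers unfamiliar with the query class, here is a conceptual sketch of Top-k Spatial Boolean Query semantics (the k nearest objects satisfying a Boolean keyword predicate); it uses a linear scan for clarity, is not sksOpen's index, and all names and data are made up.

```python
import heapq
import math

def topk_spatial_boolean(objects, query_pt, required, forbidden=frozenset(), k=5):
    """objects: iterable of (id, (x, y), keyword_set). Returns the k nearest
    objects containing all `required` and none of the `forbidden` keywords."""
    hits = []
    for oid, (x, y), keywords in objects:
        if required <= keywords and not (forbidden & keywords):
            dist = math.hypot(x - query_pt[0], y - query_pt[1])  # planar, for brevity
            hits.append((dist, oid))
    return heapq.nsmallest(k, hits)

pois = [
    ("a", (-80.19, 25.76), {"restaurant", "cuban"}),
    ("b", (-80.13, 25.79), {"restaurant", "italian"}),
    ("c", (-80.20, 25.77), {"museum"}),
]
print(topk_spatial_boolean(pois, (-80.19, 25.77), required={"restaurant"}, k=2))
```

A production engine would replace the linear scan with a combined spatial-keyword index so the top-k answer is found without touching every object.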
Abstract:
Predictive models of species distributions are important tools for fisheries management. Unfortunately, such predictive modelling can be difficult on large waterbodies where fish are difficult to detect and exhaustive sampling is not possible. In recent years, the development of Geographic Information Systems (GIS) and new occupancy modelling techniques has improved our ability to predict distributions across landscapes while accounting for imperfect detection. I surveyed the nearshore fish community at 105 sites between Kingston, Ontario and Rockport, Ontario, with the objective of modelling the geographic and environmental characteristics associated with littoral fish distributions. Occupancy modelling was performed on Round Goby, Yellow Perch, and Lepomis spp. Modelling with geographic and environmental covariates revealed the effect of shoreline exposure on nearshore habitat characteristics and on the occupancy of Round Goby. Yellow Perch and Lepomis spp. occupancy was most strongly, and negatively, associated with distance to a wetland. These results are consistent with past research on large lake systems and indicate the importance of wetlands and shoreline exposure in determining the fish community of the littoral zone. By examining three species with varying rates of occupancy and detection, this study was also able to demonstrate the variable utility of occupancy modelling.
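To show how occupancy modelling separates occupancy from imperfect detection, here is a minimal single-season model fitted by maximum likelihood; the toy detection histories and the constant psi and p (no covariates) are simplifying assumptions, not this study's models.

```python
import numpy as np
from scipy.optimize import minimize

def neg_log_lik(params, histories):
    """params: logits of psi (occupancy) and p (per-visit detection)."""
    psi, p = 1 / (1 + np.exp(-params))
    ll = 0.0
    for h in histories:
        h = np.asarray(h)
        pr_detections = np.prod(p ** h * (1 - p) ** (1 - h))  # given occupied
        if h.any():
            ll += np.log(psi * pr_detections)
        else:
            # All-zero history: occupied but never detected, or truly absent.
            ll += np.log(psi * pr_detections + (1 - psi))
    return -ll

# Each row: detections (1) / non-detections (0) over three visits to a site.
histories = [[1, 0, 1], [0, 0, 0], [0, 1, 0], [0, 0, 0], [1, 1, 0]]
fit = minimize(neg_log_lik, x0=np.zeros(2), args=(histories,))
psi_hat, p_hat = 1 / (1 + np.exp(-fit.x))
print(f"psi = {psi_hat:.2f}, p = {p_hat:.2f}")
```

In practice, psi and p are modelled as logit-linear functions of covariates such as shoreline exposure or distance to a wetland.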
Abstract:
Quality management provides companies with a framework to improve quality across their systems, reduce costs, reallocate resources efficiently, plan strategies correctly, prevent or correct errors at the right time, and increase performance. In this text, we discuss the different theories in this field and whether compliance with them is obligatory, the importance of quality management for exporting companies, and a case study of a Colombian firm whose main objective is to manage quality. In conclusion, we find that there are different types of quality management systems, such as Juran's trilogy, Deming's 14 points, Six Sigma, HACCP, and so on; that companies have to manage their suppliers; and that quality has a positive influence on export volume. In the case of Colombian small and medium enterprises, the majority have implemented tools for quality management, but this is not enough.
Abstract:
Declarative techniques such as Constraint Programming can be very effective in modeling and assisting management decisions. We present a method for managing university classrooms which extends a previous design of a Constraint-Informed Information System, generating timetables while dealing with spatial resource optimization issues. We seek to maximize space utilization along two dimensions: classroom use and occupancy rates. While we want to maximize the room use rate, we still need to satisfy the soft constraints which model students' and lecturers' preferences. We present a constraint logic programming-based local search method which relies on an evaluation function combining room utilization and timetable soft preferences. Based on this, we developed a tool which we applied to improving classroom allocation in a university. Compared to the current timetables, obtained without optimizing space utilization, the initial version of our tool reaches a 30% improvement in space utilization while preserving the quality of the timetable for both students and lecturers.
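A simplified sketch of such an evaluation function is given below; the weights, the penalty interface, and the data layout are illustrative assumptions, not the implemented tool.

```python
def evaluate(assignment, rooms, slots, preference_penalty,
             w_util=1.0, w_pref=0.5):
    """assignment: {class_id: (room_id, slot, enrolled)};
    rooms: {room_id: capacity}; preference_penalty(cls, room, slot) -> float."""
    # Room use rate: fraction of room/slot pairs actually used.
    used = {(r, s) for (r, s, _) in assignment.values()}
    use_rate = len(used) / (len(rooms) * len(slots))

    # Occupancy rate: how full the allocated rooms are, on average.
    occ = [enrolled / rooms[r] for (r, _, enrolled) in assignment.values()]
    occupancy_rate = sum(occ) / len(occ) if occ else 0.0

    # Soft constraints: accumulated student/lecturer preference penalties.
    penalty = sum(preference_penalty(c, r, s)
                  for c, (r, s, _) in assignment.items())

    # Higher is better: reward utilization, penalize preference violations.
    return w_util * (use_rate + occupancy_rate) - w_pref * penalty

rooms = {"R1": 40, "R2": 120}
slots = ["Mon09", "Mon11"]
assignment = {"Calc I": ("R2", "Mon09", 100), "Logic": ("R1", "Mon11", 35)}
print(evaluate(assignment, rooms, slots, lambda c, r, s: 0.0))
```

A local search move would reassign one class to another room or slot and keep the move whenever the evaluation improves.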
Abstract:
In this work we analyze an optimal control problem for a system of two hydroelectric power stations in cascade with reversible turbines. The objective is to optimize the profit of power production while respecting the system's restrictions. Some of these restrictions translate into state constraints, and the cost function is nonconvex, which increases the complexity of the optimal control problem. The problem is solved numerically, and two different approaches are adopted: global optimization techniques (the Chen-Burer algorithm) and a projection estimation refinement method (the PER method), the latter used as a technique to reduce the dimension of the problem. The results and execution times of the two procedures are compared.
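As a hedged sketch of this problem class (not the paper's exact model), the profit maximization can be written as an optimal control problem with state constraints, where c(t) is the electricity price, V_i the reservoir volumes, q_i the inflows, and u_i the turbined flows, with u_i < 0 when the reversible turbines pump:

```latex
% Illustrative formulation only: symbols, dynamics, and bounds are assumptions.
\begin{align*}
  \max_{u_1,\,u_2}\quad
    & \int_{0}^{T} c(t)\,\bigl(P_1(V_1(t),u_1(t)) + P_2(V_2(t),u_2(t))\bigr)\,dt \\
  \text{s.t.}\quad
    & \dot V_1(t) = q_1(t) - u_1(t), \\
    & \dot V_2(t) = q_2(t) + u_1(t) - u_2(t), \\
    & V_i^{\min} \le V_i(t) \le V_i^{\max} \quad \text{(state constraints)}, \\
    & -u_i^{\max} \le u_i(t) \le u_i^{\max}.
\end{align*}
```

Because the generated power P_i typically depends on the head, and hence on the volumes as well as the flows, the objective is nonconvex, which is what motivates the global optimization and dimension-reduction approaches compared here.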