891 results for Animals Database Management Systems


Relevance: 100.00%

Abstract:

Oomycete diseases cause significant losses across a broad range of crop and aquaculture commodities worldwide. These losses can be greatly reduced by disease management practices steered by accurate and early diagnoses of pathogen presence. Determinations of disease potential can help guide optimal crop rotation regimes, varietal selections, targeted control measures, harvest timings and crop post-harvest handling. Pathogen detection prior to infection can also reduce the incidence of disease epidemics. Classical methods for the isolation of oomycete pathogens are normally deployed only after disease symptoms appear. These processes are often time-consuming, relying on culturing the putative pathogen(s) and on the availability of expert taxonomic skills for accurate identification; a situation that frequently results in either delayed application, or routine ‘blanket’ over-application, of control measures. Increasing concerns about pesticides in the environment and the food chain, removal or restriction of their usage, and rising costs have focused interest on the development and improvement of disease management systems. To be effective, these require timely, accurate and preferably quantitative diagnoses. A wide range of rapid diagnostic tools, from point-of-care immunodiagnostic kits to next-generation nucleotide sequencing, have potential application in oomycete disease management. Here we review currently available as well as promising new technologies in the context of commercial agricultural production systems, considering the impacts of specific biotic and abiotic factors and other important considerations such as speed and ease of access to information and cost effectiveness.

Relevance: 100.00%

Abstract:

Strategy is a highly topical subject among managers, and since the world is constantly changing it is also an important subject for companies’ competitive advantage and survival. At the same time, experts in the field of strategic management describe Western techniques as complex and ineffective, while Japanese techniques have been seen as unambiguous and characterized by a focus on quality, productivity and teamwork. This calls for greater knowledge of Japanese management systems. Hoshin Kanri is a collection of Japanese best strategic management practices and therefore an interesting target for our study. Thus, on the one hand this study investigates the theory of Hoshin Kanri in order to give structure to it and provide an entry point for practitioners into the management system. On the other hand, this study investigates Hoshin Kanri in order to reveal how Japanese subsidiaries based in Sweden have implemented this strategic management system. This is done firstly by reviewing the existing literature on the subject and secondly by a collective case study with in-depth interviews conducted with managers at Japanese-owned subsidiaries based in Sweden. There are some limitations to this study. One is that the results do not cover all Japanese subsidiaries in Sweden, as not all companies participated in the study. Moreover, the study is limited by one individual’s knowledge and perception of Hoshin Kanri in each of the three companies. The study contributes to the existing literature on the topic of Hoshin Kanri by: (1) structuring the literature and the existing models under one of two categories, namely cyclical or sequential; (2) providing a model that aims at making Hoshin Kanri more understandable and attractive for practitioners to apply; (3) initiating the mapping of the spread of Hoshin Kanri among Japanese subsidiaries in Sweden; and (4) providing a Swedish model for the application of Hoshin Kanri in Japanese subsidiaries.

Relevance: 100.00%

Abstract:

The red deer (Cervus elaphus) is currently one of the most widespread and abundant wild ungulates in the Iberian Peninsula and is extremely important both ecologically, as a key species for the functioning of the ecosystems, and economically, as a major game species. In Iberia, red deer populations are subjected to different management systems that may affect the physical condition of the individuals, with further consequences for population dynamics. Studies investigating the effects of management practices and environmental conditions on the performance of red deer are still rare regarding Mediterranean ecosystems. Much of the knowledge concerning the ecology of red deer and the impact of management on its physical condition is based on studies conducted in northern and central regions of Europe, where climatological features and management practices differ from those observed in the Mediterranean areas of Iberia. Studies on a biogeographical scale can provide important insights into the relationships between species and a particular environment and contribute to the development of more targeted and appropriate management practices. The optimisation of sampling procedures and the fine-tuning of pre-existing analytical techniques are also fundamental to a more cost-effective monitoring and, therefore, are of enormous value to wildlife managers. In this context, the main aims of this thesis were: 1) to optimise the procedures used to assess the physical condition of red deer; and 2) to identify relevant management and environmental factors affecting the nutritional condition and stress physiology of red deer in the Mediterranean ecosystems of Iberia, as well as any potential interactions between those factors. Two studies with a methodological focus, presented in the first part of the thesis, demonstrated that the physical condition of red deer can be evaluated more simply, using more cost- and time-effective procedures than those traditionally used: i) it was shown that only one kidney and its associated fat is enough to assess nutritional condition in red deer; and ii) the feasibility of using near infrared spectroscopy to predict the concentrations of stress hormone metabolites was demonstrated using faeces of red deer for the first time. Subsequently, two large-scale observational studies, conducted in representative red deer populations found in Mediterranean Iberia, highlighted the importance of considering seasonal environmental variations and variables related to hunting management practices to better understand the nutritional and physiological ecology of red deer. High population densities had adverse effects on the nutritional condition of the deer and were associated with increased stress levels in natural populations without supplementary feeding. Massive hunting events involving the use of hounds were also identified as a potential source of chronic stress in red deer. The research presented in this thesis has clear implications regarding the management and monitoring of red deer populations in Mediterranean environments and is intended to help wildlife managers to implement more effective monitoring programmes and sustainable management practices.
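
As a generic illustration of the NIR calibration mentioned above (not the thesis's analysis), such models are commonly built with partial least squares (PLS) regression; the sample sizes, spectral dimensions and values below are random placeholders.

```python
# Generic PLS calibration sketch for predicting faecal glucocorticoid metabolite
# (stress hormone) concentrations from NIR spectra. All data here are random
# placeholders; a real calibration would use measured spectra and reference values.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 700))               # absorbance spectra: samples x wavelengths
y = rng.normal(loc=250, scale=60, size=120)   # reference metabolite concentrations (ng/g)

pls = PLSRegression(n_components=8)           # latent variables; tuned by cross-validation in practice
y_cv = cross_val_predict(pls, X, y, cv=10)    # cross-validated predictions
print("cross-validated R^2:", round(r2_score(y, y_cv), 3))
```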

Relevance: 100.00%

Abstract:

The purpose of this bachelor's thesis is to find an answer to the question of how strong a DRM system can be before consumers no longer accept it. DRM systems come at many levels of strictness, but they are not suitable as such for every platform. Digital rights management systems in the games industry are governed by dynamics of their own, different from those of, for example, the music industry. In addition, there is a currently accepted level of DRM from which it can be risky to deviate. The study is qualitative in nature, applying the methods of both discourse analysis and content analysis. The research material consists of the texts of various discussion threads, on the basis of which an answer to the research question is sought. The threads are classified by strength according to how strong the DRM-related news item that gave rise to each thread was. Because the material is informal, spoken-style language that always carries its own meaning in its context, the chosen methods are well suited to analysing it. Based on the results of the analyses of the different threads, it can be said that DRM cannot be stricter than the level prevailing at the time. Even a slight deviation from this level can cause considerable resentment among consumers, even to the point where the company loses revenue. The prevailing level has been reached through various experiments from which consumers have suffered, so they will not willingly accept any level stricter than the one currently in place. If a company finds that the level must be tightened, the tightening has to be done gradually and disguised with additional features. Consumers are aware of their rights and are not easily willing to give up any more of them than is necessary.

Relevance: 100.00%

Abstract:

In today's fast-paced and interconnected digital world, the data generated by an increasing number of applications is being modeled as dynamic graphs. The graph structure encodes relationships among data items, while the structural changes to the graphs as well as the continuous stream of information produced by the entities in these graphs make them dynamic in nature. Examples include social networks where users post status updates, images, videos, etc.; phone call networks where nodes may send text messages or place phone calls; road traffic networks where the traffic behavior of the road segments changes constantly, and so on. There is a tremendous value in storing, managing, and analyzing such dynamic graphs and deriving meaningful insights in real-time. However, a majority of the work in graph analytics assumes a static setting, and there is a lack of systematic study of the various dynamic scenarios, the complexity they impose on the analysis tasks, and the challenges in building efficient systems that can support such tasks at a large scale. In this dissertation, I design a unified streaming graph data management framework, and develop prototype systems to support increasingly complex tasks on dynamic graphs. In the first part, I focus on the management and querying of distributed graph data. I develop a hybrid replication policy that monitors the read-write frequencies of the nodes to decide dynamically what data to replicate, and whether to do eager or lazy replication in order to minimize network communication and support low-latency querying. In the second part, I study parallel execution of continuous neighborhood-driven aggregates, where each node aggregates the information generated in its neighborhoods. I build my system around the notion of an aggregation overlay graph, a pre-compiled data structure that enables sharing of partial aggregates across different queries, and also allows partial pre-computation of the aggregates to minimize the query latencies and increase throughput. Finally, I extend the framework to support continuous detection and analysis of activity-based subgraphs, where subgraphs could be specified using both graph structure as well as activity conditions on the nodes. The query specification tasks in my system are expressed using a set of active structural primitives, which allows the query evaluator to use a set of novel optimization techniques, thereby achieving high throughput. Overall, in this dissertation, I define and investigate a set of novel tasks on dynamic graphs, design scalable optimization techniques, build prototype systems, and show the effectiveness of the proposed techniques through extensive evaluation using large-scale real and synthetic datasets.
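
The neighborhood-driven aggregates described above can be pictured with a toy sketch. This is not the dissertation's system; it is a minimal Python example, with invented names, of keeping per-node partial aggregates up to date as events stream in, which is the idea behind the aggregation overlay's pre-computed partial aggregates.

```python
# Illustrative sketch: maintaining a continuous 1-hop neighborhood aggregate over
# a streaming graph. Each node keeps a partial aggregate updated incrementally as
# events arrive, so continuous queries are answered with low latency.
from collections import defaultdict

adjacency = defaultdict(set)       # node -> neighbors
partial_sum = defaultdict(float)   # node -> running sum of neighbor events
partial_cnt = defaultdict(int)     # node -> number of neighbor events seen

def add_edge(u, v):
    adjacency[u].add(v)
    adjacency[v].add(u)

def on_event(node, value):
    """A node produced a value; fold it into each neighbor's partial aggregate."""
    for nbr in adjacency[node]:
        partial_sum[nbr] += value
        partial_cnt[nbr] += 1

def neighborhood_avg(node):
    """Answer the continuous query directly from the pre-maintained partial aggregate."""
    return partial_sum[node] / partial_cnt[node] if partial_cnt[node] else None

add_edge("a", "b"); add_edge("a", "c")
on_event("b", 4.0); on_event("c", 2.0)
print(neighborhood_avg("a"))   # -> 3.0
```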

Relevance: 100.00%

Abstract:

Fisheries play a significant and important part in the economy of the country, contributing to foreign exchange, food security and employment creation. Lake Victoria contributes over 50% of the total annual fish catch. The purpose of fisheries management is to ensure conservation, protection, proper use, economic efficiency and equitable distribution of the fisheries resources, both for the present and future generations, through sustainable utilization. The earliest fisheries were mainly at the subsistence level. Fishing gear consisted of locally made basket traps, hooks and seine nets of papyrus. Fishing effort began to increase with the introduction of more efficient flax gillnets in 1905. Fisheries management in Uganda started in 1914. Before then, the fishery was under some form of traditional management based on dos and don'ts. History shows that the Baganda had strong spiritual beliefs in respect of "god Mukasa" (god of the Lake), and these indirectly contributed to sustainable management of the lake. If a fisherman neglected to comply with any of the ceremonies related to fishing he was expected to encounter a bad omen (Rev. Roscoe, 1965). However, with the introduction of nylon gill nets, which could catch more fish, the traditional management regime broke down. By 1955 indigenous fish species such as Oreochromis variabilis and Oreochromis esculentus had greatly declined in catches. The decline in catches led to the introduction of poor fishing methods because of competition for fish. The government, in an attempt to regulate the fishing industry, enacted the first Fisheries Ordinance in 1951 and recruited Fisheries Officers to enforce it. The government put in place minimum net mesh-sizes, and Fisheries Officers arrested fishermen without explaining the reason. This led to continued poor fishing practices. The development of government-centred management systems led to increased alienation of resource users and to wilful disregard of specific regulations. The realisation of the problems faced by the central management system led to the recognition that user groups need to be actively involved in fisheries management if the systems are to be consistent with sustainable fisheries and be legitimate. Community participation in fisheries management under the co-management approach has been adopted on Lake Victoria and other water bodies.

Relevance: 100.00%

Abstract:

Applications are subject to a continuous evolution process with a profound impact on their underlying data model, hence requiring frequent updates to the applications' class structure and database structure as well. This twofold problem, schema evolution and instance adaptation, usually known as database evolution, is addressed in this thesis. Additionally, we address concurrency and error recovery problems with a novel meta-model and its aspect-oriented implementation. Modern object-oriented databases provide features that help programmers deal with object persistence, as well as all related problems such as database evolution, concurrency and error handling. In most systems there are transparent mechanisms to address these problems; nonetheless the database evolution problem still requires some human intervention, which consumes much of programmers' and database administrators' work effort. Earlier research has demonstrated that aspect-oriented programming (AOP) techniques enable the development of flexible and pluggable systems. In these earlier works, the schema evolution and instance adaptation problems were addressed as database management concerns. However, none of this research was focused on orthogonal persistent systems. We argue that AOP techniques are well suited to address these problems in orthogonal persistent systems. Regarding concurrency and error recovery, earlier research showed that only syntactic obliviousness between the base program and aspects is possible. Our meta-model and framework follow an aspect-oriented approach focused on the object-oriented orthogonal persistent context. The proposed meta-model is characterized by its simplicity, in order to achieve efficient and transparent database evolution mechanisms. Our meta-model supports multiple versions of a class structure by applying a class versioning strategy, thus enabling bidirectional application compatibility among versions of each class structure. That is to say, the database structure can be updated while earlier applications continue to work, as do later applications that have only known the updated class structure. The specific characteristics of orthogonal persistent systems, as well as a metadata enrichment strategy within the application's source code, complete the inception of the meta-model and have motivated our research work. To test the feasibility of the approach, a prototype was developed. Our prototype is a framework that mediates the interaction between applications and the database, providing them with orthogonal persistence mechanisms. These mechanisms are introduced into applications as an aspect in the aspect-oriented sense. Objects do not require the extension of any superclass or the implementation of an interface, nor do they need to contain a particular annotation. Parametric type classes are also correctly handled by our framework. However, classes that belong to the programming environment must not be handled as versionable due to restrictions imposed by the Java Virtual Machine. Regarding concurrency support, the framework provides applications with a multithreaded environment which supports database transactions and error recovery. The framework keeps applications oblivious to the database evolution problem, as well as to persistence. Programmers can update the applications' class structure because the framework will produce a new version of it at the database metadata layer. Using our XML-based pointcut/advice constructs, the framework's instance adaptation mechanism is extended, hence keeping the framework oblivious to this problem as well. The potential development gains provided by the prototype were benchmarked. In our case study, the results confirm that the mechanisms' transparency has positive repercussions on programmer productivity, simplifying the entire evolution process at the application and database levels. The meta-model itself was also benchmarked in terms of complexity and agility. Compared with other meta-models, it requires fewer meta-object modifications in each schema evolution step. Other types of tests were carried out in order to validate prototype and meta-model robustness. In order to perform these tests, we used an OO7 small-size database due to its data model complexity. Since the developed prototype offers some features that were not observed in other known systems, performance benchmarks were not possible. However, the developed benchmark is now available to perform future performance comparisons with equivalent systems. In order to test our approach in a real-world scenario, we developed a proof-of-concept application. This application was developed without any persistence mechanisms. Using our framework and minor changes applied to the application's source code, we added these mechanisms. Furthermore, we tested the application in a schema evolution scenario. This real-world experience using our framework showed that applications remain oblivious to persistence and database evolution. In this case study, our framework proved to be a useful tool for programmers and database administrators. Performance issues and the single Java Virtual Machine concurrency model are the major limitations found in the framework.
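
The class-versioning strategy lends itself to a small illustration. The thesis implements this in Java with aspects and XML pointcut/advice constructs; the sketch below is only a hypothetical Python analogue of the instance-adaptation step, with invented names, showing how a stored object of an older class-structure version can be converted to the version a running application expects.

```python
# Minimal, illustrative class-versioning / instance-adaptation sketch (not the
# thesis's framework). Converters are registered per class and version step, and
# stored objects are adapted lazily to the version the application knows.
converters = {}   # (class_name, from_version, to_version) -> conversion function

def register_converter(cls_name, from_v, to_v):
    def wrap(fn):
        converters[(cls_name, from_v, to_v)] = fn
        return fn
    return wrap

@register_converter("Customer", 1, 2)
def customer_v1_to_v2(fields):
    # hypothetical change: v2 splits a single 'name' field into first/last name
    first, _, last = fields.pop("name").partition(" ")
    fields.update(first_name=first, last_name=last)
    return fields

def adapt(cls_name, fields, stored_v, wanted_v):
    """Instance adaptation: apply converters until the stored object matches
    the class-structure version the application was written against."""
    while stored_v < wanted_v:
        fields = converters[(cls_name, stored_v, stored_v + 1)](fields)
        stored_v += 1
    return fields

print(adapt("Customer", {"name": "Ada Lovelace"}, stored_v=1, wanted_v=2))
```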

Relevance: 100.00%

Abstract:

The process of building Data Warehouses (DW) is well known, with well-defined stages, but at the same time it is mostly carried out manually by IT people in conjunction with business people. Web Warehouses (WW) are DW whose data sources are taken from the web. We define a flexible WW, which can be configured for different domains through the selection of web sources and the definition of data processing characteristics. A Business Process Management (BPM) system allows modeling and executing Business Processes (BPs), providing support for the automation of processes. To support the process of building flexible WW we propose two levels of BPs: a configuration process to support the selection of web sources and the definition of schemas and mappings, and a feeding process which takes the defined configuration and loads the data into the WW. In this paper we present a proof of concept of both processes, with a focus on the configuration process and the defined data.
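
A minimal sketch of the two-level idea follows. It is not the paper's implementation; the configuration keys, source URL and mapping names are assumptions used only to show how the output of a configuration process could drive a feeding process that loads data into the WW.

```python
# Illustrative sketch: a configuration produced by the configuration process
# (sources + mappings), and a feeding process that executes it. extract/load are
# stand-ins for real connectors to web sources and to the warehouse.
from typing import Callable

configuration = {
    "domain": "air-quality",                                   # hypothetical domain
    "sources": [
        {"url": "https://example.org/stations.csv", "format": "csv"},
    ],
    # mapping from source fields to warehouse dimensions/measures
    "mappings": {"station": "dim_station", "pm25": "fact_measurement.pm25"},
}

def feed(config: dict,
         extract: Callable[[dict], list],
         load: Callable[[str, dict], None]) -> None:
    """Feeding process: take the defined configuration and load the data into the WW."""
    for source in config["sources"]:
        for record in extract(source):
            row = {config["mappings"][k]: v
                   for k, v in record.items() if k in config["mappings"]}
            load(config["domain"], row)

feed(configuration,
     extract=lambda src: [{"station": "S1", "pm25": 12.3}],   # fake extracted record
     load=lambda domain, row: print(domain, row))
```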

Relevance: 100.00%

Abstract:

Beef businesses in northern Australia are facing increased pressure to be productive and profitable, with challenges such as climate variability and poor financial performance over the past decade. Declining terms of trade, limited recent gains in on-farm productivity, low profit margins under current management systems and current climatic conditions will leave little capacity for businesses to absorb climate change-induced losses. In order to generate a whole-of-business focus towards management change, the Climate Clever Beef project in the Maranoa-Balonne region of Queensland trialled the use of business analysis with beef producers to improve financial literacy, provide a greater understanding of current business performance and initiate changes to current management practices. Demonstration properties were engaged and a systematic approach was used to assess current business performance, evaluate impacts of management changes on the business, trial practices and promote successful outcomes to the wider industry. The focus was on improving financial literacy skills, understanding the business’ key performance indicators and modifying practices to improve both business productivity and profitability. To best achieve the desired outcomes, several extension models were employed: the ‘group facilitation/empowerment model’, the ‘individual consultant/mentor model’ and the ‘technology development model’. Providing producers with a whole-of-business approach and using business analysis in conjunction with on-farm trials and various extension methods proved to be a successful way to encourage producers in the region to adopt new practices into their businesses in the areas of greatest impact. The areas targeted for development within businesses generally led to improvements in animal performance and grazing land management, further improving the prospects for climate resilience.

Relevance: 100.00%

Abstract:

The research investigates the feasibility of using web-based project management systems for dredging. To achieve this objective, the research assessed both the positive and negative aspects of using web-based technology for the management of dredging projects. Information gained from the literature review and prior investigations of dredging projects revealed that project performance and the social, political, technical, and business aspects of the organization were important factors in deciding to use web-based systems for the management of dredging projects. These factors were used to develop the research assumptions. An exploratory case study methodology was used to gather the empirical evidence and perform the analysis. An operational prototype of the system was developed to help evaluate developmental and functional requirements, as well as the influence on performance and on the organization. The evidence gathered from three case study projects, and from a survey of 31 experts, was used to validate the assumptions. Baselines representing the assumptions were created as a reference against which to assess the responses and qualitative measures, and the deviation of the responses from these baselines was used in the analysis. Finally, the conclusions were assessed by validating the assumptions against the evidence derived from the analysis. The research findings are as follows: 1. The system would help improve project performance. 2. Resistance to implementation may be experienced if the system is implemented; therefore, resistance to implementation needs to be investigated further and more R&D work is needed in order to advance to the final design and implementation. 3. The system may be divided into standalone modules in order to simplify the system and facilitate incremental changes. 4. The QA/QC conceptual approach used by this research needs to be redefined during future R&D to satisfy both owners and contractors. Yin's (2009) Case Study Research Design and Methods was used to develop the research approach, design, data collection, and analysis. Markus's (1983) resistance theory was used during the definition of the assumptions to predict potential problems with the implementation of web-based project management systems for the dredging industry. Keen's (1981) incremental changes and facilitative approach tactics were used as a basis to classify solutions and to determine how to overcome resistance to implementation of the web-based project management system. Davis's (1989) Technology Acceptance Model (TAM) was used to assess the solutions needed to overcome resistance to the implementation of web-based management systems for dredging projects.

Relevance: 100.00%

Abstract:

Climate change and carbon (C) sequestration are a major focus of research in the twenty-first century. Globally, soils store about 300 times the amount of C that is released per annum through the burning of fossil fuels (Schulze and Freibauer 2005). Land clearing and the introduction of agricultural systems have led to rapid declines in soil C reserves. The recent introduction of conservation agricultural practices has not reversed the decline in soil C content, although it has minimized the rate of decline (Baker et al. 2007; Hulugalle and Scott 2008). Lal (2003) estimated the quantum of C pools in the atmosphere, terrestrial ecosystems, and oceans and reported a “missing C” component in the world C budget. Though not yet proven, this could be linked to C losses through runoff and soil erosion (Lal 2005) and a lack of C accounting in inland water bodies (Cole et al. 2007). Land management practices to minimize microbial respiration and soil organic C (SOC) decline, such as minimum tillage or no tillage, were extensively studied in the past, and the soil erosion and runoff studies monitoring those management systems focused on other nutrients such as nitrogen (N) and phosphorus (P).

Relevance: 100.00%

Abstract:

This dissertation investigates customer behavior modeling in service outsourcing and revenue management in the service sector (i.e., the airline and hotel industries). In particular, it focuses on a common theme of improving firms’ strategic decisions through the understanding of customer preferences. Decisions concerning degrees of outsourcing, such as firms’ capacity choices, are important to performance outcomes. These choices are especially important in high-customer-contact services (e.g., the airline industry) because of the characteristics of services: simultaneity of consumption and production, and intangibility and perishability of the offering. Essay 1 estimates how outsourcing affects customer choices and market share in the airline industry, and consequently the revenue implications of outsourcing. However, outsourcing decisions are typically endogenous: a firm may choose whether or not to outsource based on what it expects to be the best outcome. Essay 2 contributes to the literature by proposing a structural model that captures a firm’s profit-maximizing decision-making behavior in a market. This makes it possible to predict the consequences (i.e., performance outcomes) of future strategic moves. Another emerging area in service operations management is revenue management. Choice-based revenue systems incorporate discrete choice models into traditional revenue management algorithms. To successfully implement a choice-based revenue system, it is necessary to estimate customer preferences as a valid input to the optimization algorithms. The third essay investigates how to estimate customer preferences when part of the market is consistently unobserved. This issue is especially prominent in choice-based revenue management systems: normally a firm only observes its own purchases, while customers who purchase from competitors or who do not purchase at all are unobserved. Most current estimation procedures depend on unrealistic assumptions about the customer arrival process. This study proposes a new estimation methodology which does not require any prior knowledge about the customer arrival process and allows for arbitrary demand distributions. Compared with previous methods, this model performs better when the true demand is highly variable.
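
The customer-preference input that choice-based revenue management systems need is typically a discrete choice model. The sketch below is a generic multinomial logit with a no-purchase option, not the essay's estimator; the utilities and offer set are invented placeholders.

```python
# Minimal multinomial logit (MNL) sketch with an explicit no-purchase option,
# the kind of customer-preference model fed into choice-based revenue systems.
import numpy as np

def choice_probabilities(utilities, offered):
    """P(choose product j | offer set), with the no-purchase utility normalized to 0."""
    expu = np.where(offered, np.exp(utilities), 0.0)   # closed products get probability 0
    denom = 1.0 + expu.sum()                           # 1.0 = exp(0) for the no-purchase option
    return expu / denom, 1.0 / denom                   # per-product probabilities, P(no purchase)

utilities = np.array([1.2, 0.4, -0.3])     # hypothetical estimated product utilities
offered   = np.array([True, True, False])  # fare classes currently open
p_buy, p_none = choice_probabilities(utilities, offered)
print(p_buy.round(3), round(p_none, 3))
```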

Relevance: 100.00%

Abstract:

Introduction: Poor disposal of grey water and sewage is associated with infectious diseases in humans, animals and plants. Objective: To determine the domestic wastewater management systems used in the dwellings and institutions of the rural parish of San Pablo de Shaglli, canton Santa Isabel, during 2014. Materials and methods: A descriptive study was carried out in a population of 115 people. Data were obtained by survey and analysed with SPSS version 15. Ages ranged from 18 to 86 years, with a median of 45; 51.3% were men. 63.5% had primary education, 3.5% were illiterate, and 6.1% had higher education. 52.2% dispose of grey water from the kitchen in the open air, 55.7% dispose of water from personal washing in the open air, as is the case for 6.26% of grey water from laundry, and 47.0% dispose of sewage in a septic tank. Conclusion: More than 50.0% dispose of grey water from the kitchen, personal washing and laundry in the open air, and a similar percentage disposes of sewage in a septic tank.

Relevance: 100.00%

Abstract:

Nowadays, power grids are critical infrastructures on which everything else relies, and their correct behavior is of the highest priority. New smart devices are being deployed to manage and control power grids more efficiently and avoid instability. However, the deployment of such smart devices, like Phasor Measurement Units (PMU) and Phasor Data Concentrators (PDC), opens new opportunities for cyber attackers to exploit network vulnerabilities. If a PDC is compromised, all data coming from PMUs to that PDC is lost, reducing network observability. Our approach to solving this problem is to develop an Intrusion Detection System (IDS) in a Software-Defined Network (SDN), allowing the IDS to detect compromised devices and use that information as an input for a self-healing SDN controller, which redirects the data of the PMUs to a new, uncompromised PDC, maintaining the maximum possible network observability at every moment. During this research, we have successfully implemented self-healing in an example network with an SDN controller based on the Ryu controller. We have also assessed intrinsic vulnerabilities of Wide Area Measurement Systems (WAMS) and SCADA networks, and developed rules for the intrusion detection system that specifically protect against the vulnerabilities of these networks. The integration of the IDS and the SDN controller was also successful. To achieve this goal, the first steps were to implement an existing self-healing SDN controller and assess intrinsic vulnerabilities of WAMS and SCADA networks; after that, we integrated the Ryu controller with Snort and created Snort rules specific to SCADA and WAMS systems and protocols.
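
The self-healing step can be pictured with a small sketch. This is not the thesis code and does not use the Ryu API; the PDC/PMU names and the install_route callback are placeholders standing in for the flow-rule updates the SDN controller would push.

```python
# Illustrative self-healing decision logic: when the IDS flags a PDC as
# compromised, PMU streams assigned to it are re-homed to an uncompromised PDC
# so that network observability is preserved.
pmu_assignment = {"pmu1": "pdc_a", "pmu2": "pdc_a", "pmu3": "pdc_b"}  # placeholder topology
healthy_pdcs = {"pdc_a", "pdc_b", "pdc_c"}

def on_ids_alert(compromised_pdc, install_route):
    """Self-healing step triggered by an IDS alert: reroute affected PMU streams."""
    healthy_pdcs.discard(compromised_pdc)
    if not healthy_pdcs:
        raise RuntimeError("no uncompromised PDC left; observability lost")
    backup = sorted(healthy_pdcs)[0]           # simplest policy: pick any healthy PDC
    for pmu, pdc in pmu_assignment.items():
        if pdc == compromised_pdc:
            pmu_assignment[pmu] = backup
            install_route(pmu, backup)         # placeholder for SDN flow-rule installation

on_ids_alert("pdc_a", install_route=lambda pmu, pdc: print(f"reroute {pmu} -> {pdc}"))
print(pmu_assignment)
```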