930 results for Integrated farming systems


Relevance: 30.00%

Abstract:

This paper discusses demand and supply chain management and examines how artificial intelligence techniques and RFID technology can enhance the responsiveness of the logistics workflow. The proposed system is expected to have a significant impact on the performance of logistics networks because it can adapt to unexpected supply and demand changes in a volatile marketplace, with its responsiveness underpinned by Radio Frequency Identification (RFID) technology. Recent studies have found that RFID and artificial intelligence techniques drive the development of total solutions in the logistics industry. Apart from tracking the movement of goods, RFID can play an important role in reflecting the inventory levels of various distribution areas. In today's globalized industrial environment, physical logistics operations and the associated flow of information are essential for companies to realize an efficient logistics workflow. A flexible logistics workflow, characterized by fast responsiveness to customer requirements through the integration of various value chain activities, is fundamental to leveraging the business performance of enterprises. The significance of this research is its demonstration of the synergy of combining advanced technologies into an integrated system that helps achieve a lean and agile logistics workflow.
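
As a loose illustration of the inventory visibility the abstract attributes to RFID (a minimal sketch with hypothetical tag reads, areas and threshold, not the paper's system), per-area stock levels can be derived from raw read events as follows:

```python
from collections import defaultdict

# Hypothetical RFID read events: (tag_id, area, direction), where
# direction is "in" when goods enter an area and "out" when they leave.
reads = [
    ("TAG-001", "warehouse-A", "in"),
    ("TAG-002", "warehouse-A", "in"),
    ("TAG-001", "warehouse-A", "out"),
    ("TAG-003", "store-12", "in"),
]

REORDER_THRESHOLD = 2  # assumed threshold, for illustration only

def inventory_by_area(events):
    """Net inventory level per distribution area from raw RFID reads."""
    levels = defaultdict(int)
    for _tag, area, direction in events:
        levels[area] += 1 if direction == "in" else -1
    return dict(levels)

def areas_needing_replenishment(levels, threshold=REORDER_THRESHOLD):
    """Areas whose stock has dropped below the reorder threshold."""
    return [area for area, qty in levels.items() if qty < threshold]

if __name__ == "__main__":
    levels = inventory_by_area(reads)
    print(levels)                               # {'warehouse-A': 1, 'store-12': 1}
    print(areas_needing_replenishment(levels))  # ['warehouse-A', 'store-12']
```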

Relevance: 30.00%

Abstract:

The primary aim of this research is to understand what constitutes management accounting and control (MACs) practice and how these control processes are implicated in the day-to-day work practices and operations of the organisation. It also examines the changes that happen in MACs practices over time as multiple actors within organisational settings interact with each other. I adopt a distinctive practice theory approach (i.e. sociomateriality) and the concept of imbrication in this research to show that MACs practices emerge from the entanglement between human/social agency and material/technological agency within an organisation. Changes in the pattern of MACs practices happen through imbrication processes which are produced as the two agencies entangle. The theoretical approach employed in this research offers a valuable lens which seeks to reveal the depth of these interactions and uncover the way in which the social and the material imbricate. The theoretical framework helps to reveal how these constructions impact on and produce modifications of MACs practices. The exploration of control practices at different hierarchical levels (i.e. from the operational to middle and senior management) using the concept of the imbrication process also maps the dynamic flow of controls from operational to top management and vice versa within the organisation. The empirical data at the focus of this research have been gathered from a case study of a large, vertically integrated palm oil company in Malaysia, specifically its refinery sector. The palm oil industry is a significant industry in Malaysia, contributing an average of 4.5% of Malaysian Gross Domestic Product over the period 1990–2010. The Malaysian palm oil industry also has a significant presence in the global food oil supply, contributing 26% of total global trade in oils and fats in 2010. The case organisation is a significant contributor to the Malaysian palm oil industry, and the research access provided an interesting opportunity to explore the interactions between different groups of people and material/technology in a relatively heavy-process food industry setting. My research examines how these interactions shape and are shaped by control practices in a dynamic cycle of imbrications over both short and medium time periods.

Relevance: 30.00%

Abstract:

The international economic and business environment continues to develop at a rapid rate. Increasing interactions between economies, particularly between Europe and Asia, have raised many important issues regarding transport infrastructure, logistics and broader supply chain management. The potential exists to further stimulate trade provided that these issues are addressed in a logical and systematic manner. However, if this potential is to be realised in practice there is a need to re-evaluate current supply chain configurations. A mismatch currently exists between technological capability and the supply chain or logistical reality. This mismatch has sharpened the focus on the need for robust approaches to supply chain re-engineering. Traditional approaches to business re-engineering have been based on manufacturing systems engineering and business process management. The recognition that all companies exist as part of bigger supply chains has fundamentally changed the focus of re-engineering. Inefficiencies anywhere in a supply chain leave the chain as a whole unable to reach its true competitive potential. This reality, combined with the potentially radical impact on business and supply chain architectures of the technologies associated with electronic business, requires organisations to adopt innovative approaches to supply chain analysis and re-design. This paper introduces a systems approach to supply chain re-engineering aimed at addressing the challenges that the evolving business environment brings with it. The approach, which is based on work with a variety of both conventional and electronic supply chains, comprises underpinning principles, a methodology, guidelines on good working practice, and a suite of tools and techniques. The adoption of approaches such as the one outlined in this paper helps to ensure that robust supply chains are designed and implemented in practice. This facilitates an integrated approach, with the involvement of all key stakeholders throughout the design process.

Relevance: 30.00%

Abstract:

The Protein pKa Database (PPD) v1.0 provides a compendium of protein residue-specific ionization equilibria (pKa values), collated from the primary literature, in the form of a web-accessible PostgreSQL relational database. Ionizable residues play key roles in the molecular mechanisms that underlie many biological phenomena, including protein folding and enzyme catalysis. The PPD serves as a general protein pKa archive and as a source of data for the development and improvement of pKa prediction systems. The database is accessed through an HTML interface, which offers two fast, efficient search methods: an amino acid-based query and a Basic Local Alignment Search Tool (BLAST) search. Entries also give details of experimental techniques and links to other key databases, such as the National Center for Biotechnology Information and the Protein Data Bank, providing the user with considerable background information.
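
The abstract does not describe the PPD schema or API, so the following Python/SQLite sketch is only an assumption of what a residue-level, amino acid-based pKa query against such a relational store might look like; the table names, columns and values are illustrative.

```python
import sqlite3

# Hypothetical, simplified schema -- the real PPD schema is not described
# in the abstract, so these table and column names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE pka_measurements (
        protein_name TEXT,
        pdb_id       TEXT,
        residue_type TEXT,   -- e.g. 'ASP', 'GLU', 'HIS'
        residue_num  INTEGER,
        pka          REAL,
        method       TEXT    -- experimental technique, e.g. 'NMR'
    );
    INSERT INTO pka_measurements VALUES
        ('Hen egg-white lysozyme', '1LYZ', 'GLU', 35, 6.2, 'NMR'),
        ('Hen egg-white lysozyme', '1LYZ', 'ASP', 52, 3.7, 'NMR');
""")

# Amino-acid-based query: all recorded pKa values for glutamate residues.
rows = conn.execute(
    "SELECT protein_name, residue_num, pka, method "
    "FROM pka_measurements WHERE residue_type = ?",
    ("GLU",),
).fetchall()
for row in rows:
    print(row)   # ('Hen egg-white lysozyme', 35, 6.2, 'NMR')
```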

Relevance: 30.00%

Abstract:

Dubrovin-type equations for the N-gap solution of a completely integrable system associated with a polynomial pencil are constructed and then integrated to a system of functional equations. The approach used to derive these results is a generalization of the familiar process of finding the 1-soliton (1-gap) solution by integrating the ODE obtained from the soliton equation via the substitution u = u(x + λt).
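
As a familiar concrete instance of that substitution (a textbook example, not drawn from the paper), the Korteweg–de Vries equation reduces to an integrable ODE under the travelling-wave ansatz u = u(ξ), ξ = x + λt:

```latex
% KdV equation and its 1-soliton (1-gap) reduction via u = u(x + \lambda t)
\[
u_t + 6\,u\,u_x + u_{xxx} = 0,
\qquad u(x,t) = u(\xi), \quad \xi = x + \lambda t ,
\]
\[
\lambda u' + 6\,u\,u' + u''' = 0
\;\;\Longrightarrow\;\;
\lambda u + 3u^{2} + u'' = C .
\]
% With decaying boundary conditions (u, u', u'' \to 0 as |\xi| \to \infty,
% hence C = 0), one further quadrature yields the sech^2 soliton profile.
```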

Relevance: 30.00%

Abstract:

A tilted fiber Bragg grating (TFBG) was integrated as the dispersive element in a high-performance biomedical imaging system. The spectrum emitted by the 23 mm long active region of the fiber is projected through custom-designed optics, consisting of a cylindrical lens for vertical beam collimation followed by an achromatic doublet, onto a linear detector array. High-resolution tomograms of biomedical samples were successfully acquired with the frequency-domain OCT system: tomograms of ophthalmic and dermal samples were obtained with 2.84 μm axial and 10.2 μm lateral resolution. The miniaturization reduces costs and has the potential to further extend the field of application for OCT systems in biology, medicine and technology. © 2014 SPIE.
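
For context only (a standard relation assuming a Gaussian source spectrum, not quoted from the abstract), the axial resolution of an OCT system is set by the centre wavelength λ0 and bandwidth Δλ of the detected spectrum:

```latex
% Standard axial-resolution relation for OCT with a Gaussian source spectrum
\[
\delta z \;=\; \frac{2\ln 2}{\pi}\,\frac{\lambda_0^{2}}{\Delta\lambda} ,
\]
% so micrometre-scale axial resolution requires a broadband source and a
% dispersive element that resolves that bandwidth on the detector array.
```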

Relevance: 30.00%

Abstract:

Digital information services are gradually becoming integrated with other systems and services such as library automation systems, student information services, and electronic learning systems. Users demand seamless access to a multitude of digital information services without leaving their desktop computers. They prefer systems that recognize them when they log on, acknowledge their rights and privileges, and thus provide personalized information services. This paper summarizes recent developments concerning integrated and personalized digital information services. It first emphasizes the role of the Internet in providing information services and then discusses integration and personalization issues, highlighting their importance for digital information services.
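
A minimal sketch of the log-on and personalization flow described above might look as follows; the user records, rights and services are hypothetical and are not taken from the paper.

```python
# Hypothetical user directory: rights and preferences keyed by user id.
USERS = {
    "jdoe": {
        "rights": {"library_catalogue", "e_journals"},
        "preferences": {"subjects": ["information science"], "language": "en"},
    },
}

SERVICES = {
    "library_catalogue": "Library automation system",
    "e_journals":        "Electronic journal collection",
    "course_portal":     "Electronic learning system",
}

def personalised_portal(user_id):
    """Return only the services this user is entitled to, plus preferences."""
    user = USERS.get(user_id)
    if user is None:
        raise PermissionError("unknown user")
    entitled = {k: v for k, v in SERVICES.items() if k in user["rights"]}
    return {"services": entitled, "preferences": user["preferences"]}

print(personalised_portal("jdoe"))
```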

Relevance: 30.00%

Abstract:

The demands on contemporary information systems are constantly increasing. In a dynamic business environment an organization has to be prepared for sudden growth, shrinking or other types of reorganization. Such a change brings the need to adapt the information system serving the company. Associating access rights to parts of the system with users, groups of users, user roles, etc. is of great importance for defining the different activities in the company and for restricting each employee's access rights according to his or her status. The mechanisms for access rights management in a system are taken into account during system design, and in most cases they are built into the system. This paper offers an approach to developing a user rights framework applicable to information systems, presenting a reusable, extendable mechanism that can be integrated into them.
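
The paper's own framework is not reproduced in the abstract, so the following is only a generic role-based sketch of associating access rights with users and roles, in which a reorganization changes the mapping tables without touching the checking logic; all names are illustrative.

```python
# Generic role-based access-control sketch (illustrative, not the paper's design).
ROLE_PERMISSIONS = {
    "accountant": {"invoices:read", "invoices:write"},
    "manager":    {"invoices:read", "reports:read", "users:manage"},
    "employee":   {"invoices:read"},
}

USER_ROLES = {
    "alice": {"accountant"},
    "bob":   {"employee"},
}

def permissions_for(user):
    """Union of the permissions granted by every role the user holds."""
    perms = set()
    for role in USER_ROLES.get(user, set()):
        perms |= ROLE_PERMISSIONS.get(role, set())
    return perms

def check_access(user, permission):
    """True if the user's roles grant the requested permission."""
    return permission in permissions_for(user)

# A reorganisation only touches the mapping tables, not the checking logic:
USER_ROLES["bob"].add("manager")
print(check_access("bob", "reports:read"))   # True after the role change
```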

Relevance: 30.00%

Abstract:

This paper presents a new, dynamic feature representation method for high-value parts consisting of complex and intersecting features. The method first extracts features from the CAD model of a complex part. The dynamic status of each feature is then established between the various operations to be carried out during the whole manufacturing process. Each manufacturing and verification operation can be planned and optimized using the real condition of a feature, thus enhancing accuracy, traceability and process control. The dynamic feature representation is complementary to the design models used as the underlying basis in current CAD/CAM and decision support systems. © 2012 CIRP.
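
One way to picture a per-feature, per-operation status record (a hypothetical sketch, not the paper's data model) is a small structure of the following kind:

```python
from dataclasses import dataclass, field

@dataclass
class Feature:
    """A machining feature extracted from the CAD model (illustrative only)."""
    name: str
    nominal_dimensions: dict
    # Status of this feature after each manufacturing/verification operation,
    # e.g. {"rough_milling": "0.5 mm stock remaining"}.
    status_by_operation: dict = field(default_factory=dict)

    def record_status(self, operation, status):
        """Update the feature's dynamic status after an operation."""
        self.status_by_operation[operation] = status

    def latest_status(self):
        """Most recent recorded condition, or the as-designed state."""
        if not self.status_by_operation:
            return "as designed"
        return list(self.status_by_operation.values())[-1]

pocket = Feature("pocket_1", {"depth_mm": 12.0, "width_mm": 40.0})
pocket.record_status("rough_milling", "0.5 mm stock remaining")
pocket.record_status("on_machine_probing", "depth 11.48 mm measured")
print(pocket.latest_status())   # depth 11.48 mm measured
```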

Relevance: 30.00%

Abstract:

A new topology for the high-frequency alternating current (HFAC) inverter bridge arm is proposed, comprising a coupled inductor, a switching device and an active clamp circuit. Based on it, new single-phase and three-phase inverters are proposed and their operating states are analysed alongside the traditional H-bridge inverter. Multi-phase and multi-level isolated inverters are also developed using the HFAC bridge arm. Furthermore, based on the proposed HFAC bridge arm, a front-end DC-DC converter is developed for photovoltaic systems to demonstrate the application of the proposed HFAC converter. Simulation and experimental results from prototype converters validate the proposed topologies, which can be utilised widely in high-frequency power conversion applications such as induction heating and wireless power transfer.

Relevance: 30.00%

Abstract:

Today, databases have become an integral part of information systems. In the past two decades, we have seen different database systems being developed independently and used in different application domains. Today's interconnected networks and advanced applications, such as data warehousing, data mining and knowledge discovery, and intelligent data access to information on the Web, have created a need for integrated access to such heterogeneous, autonomous, distributed database systems. Heterogeneous/multidatabase research has focused on this issue, resulting in many different approaches. However, no single, generally accepted methodology has emerged in academia or industry that provides ubiquitous intelligent data access from heterogeneous, autonomous, distributed information sources. This thesis describes a heterogeneous database system being developed at the High-performance Database Research Center (HPDRC). A major impediment to ubiquitous deployment of multidatabase technology is the difficulty in resolving semantic heterogeneity, that is, identifying related information sources for integration and querying purposes. Our approach considers the semantics of the meta-data constructs in resolving this issue. The major contributions of the thesis work include: (i) a scalable, easy-to-implement architecture for developing a heterogeneous multidatabase system, utilizing the Semantic Binary Object-oriented Data Model (Sem-ODM) and the Semantic SQL query language to capture the semantics of the data sources being integrated and to provide an easy-to-use query facility; (ii) a methodology for semantic heterogeneity resolution that investigates the extents of the meta-data constructs of component schemas, shown to be correct, complete and unambiguous; (iii) a semi-automated technique for identifying semantic relations, which form the basis of the semantic knowledge for integration and querying, using shared ontologies for context mediation; (iv) resolutions for schematic conflicts and a language for defining global views from a set of component Sem-ODM schemas; (v) the design of a knowledge base for storing and manipulating meta-data and knowledge acquired during the integration process, which acts as the interface between the integration and query processing modules; (vi) techniques for Semantic SQL query processing and optimization based on semantic knowledge in a heterogeneous database environment; and (vii) a framework for intelligent computing and communication on the Internet applying the concepts of this work.
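
Sem-ODM and Semantic SQL are not specified in the abstract, so the sketch below only illustrates the general mediator idea: one global query over a shared ontology concept is translated into queries against heterogeneous component sources. Every name and mapping is hypothetical.

```python
# Hypothetical mediator: maps a shared ontology concept to the local
# table/attribute names of each autonomous component database.
ONTOLOGY_MAPPINGS = {
    "Customer": {
        "erp_db":   {"table": "clients",   "name_col": "client_name"},
        "sales_db": {"table": "customers", "name_col": "full_name"},
    },
}

# Stand-ins for the component databases (in reality: remote and heterogeneous).
COMPONENT_DATA = {
    "erp_db":   {"clients":   [{"client_name": "Acme Ltd"}]},
    "sales_db": {"customers": [{"full_name": "Acme Ltd"},
                               {"full_name": "Zenith plc"}]},
}

def global_query(concept):
    """Answer a global query by translating it for each component source."""
    results = []
    for source, mapping in ONTOLOGY_MAPPINGS[concept].items():
        rows = COMPONENT_DATA[source][mapping["table"]]
        for row in rows:
            results.append({"source": source, "name": row[mapping["name_col"]]})
    return results

print(global_query("Customer"))
```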

Relevance: 30.00%

Abstract:

The main challenges of multimedia data retrieval lie in the effective mapping between low-level features and high-level concepts, and in individual users' subjective perceptions of multimedia content. The objective of this dissertation is to develop an integrated multimedia indexing and retrieval framework with the aim of bridging the gap between semantic concepts and low-level features. To achieve this goal, a set of core techniques has been developed, including image segmentation, content-based image retrieval, object tracking, video indexing, and video event detection. These core techniques are integrated in a systematic way to enable semantic search for images and videos, and can be tailored to solve problems in other multimedia-related domains. In image retrieval, two new methods of bridging the semantic gap are proposed: (1) for general content-based image retrieval, a stochastic mechanism is utilized to enable the long-term learning of high-level concepts from a set of training data, such as user access frequencies and access patterns of images; (2) in addition to whole-image retrieval, a novel multiple instance learning framework is proposed for object-based image retrieval, by which a user can more effectively search for images that contain multiple objects of interest. An enhanced image segmentation algorithm is developed to extract object information from images. This segmentation algorithm is further used in video indexing and retrieval, where a robust video shot/scene segmentation method is developed based on low-level visual feature comparison, object tracking, and audio analysis. Based on shot boundaries, a novel data mining framework is further proposed to detect events in soccer videos, fully utilizing the multi-modality features and object information obtained through video shot/scene detection. Another contribution of this dissertation is the potential of the above techniques to be tailored and applied to other multimedia applications, demonstrated by their utilization in traffic video surveillance. The enhanced image segmentation algorithm, coupled with an adaptive background learning algorithm, improves the performance of vehicle identification. A sophisticated object tracking algorithm is proposed to track individual vehicles, while the spatial and temporal relationships of vehicle objects are modeled by an abstract semantic model.
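
As a hedged illustration of shot segmentation by low-level visual feature comparison (a common histogram-difference baseline, not necessarily the dissertation's algorithm), consecutive-frame histogram distances can simply be thresholded:

```python
import numpy as np

def grey_histogram(frame, bins=16):
    """Normalised grey-level histogram of one frame (H x W uint8 array)."""
    hist, _ = np.histogram(frame, bins=bins, range=(0, 256))
    return hist / hist.sum()

def shot_boundaries(frames, threshold=0.4):
    """Indices where the consecutive-frame histogram distance exceeds a threshold."""
    boundaries = []
    prev = grey_histogram(frames[0])
    for i, frame in enumerate(frames[1:], start=1):
        cur = grey_histogram(frame)
        if 0.5 * np.abs(cur - prev).sum() > threshold:  # total-variation distance
            boundaries.append(i)
        prev = cur
    return boundaries

# Synthetic example: a dark scene followed by a bright scene.
dark = [np.full((120, 160), 30, dtype=np.uint8) for _ in range(5)]
bright = [np.full((120, 160), 220, dtype=np.uint8) for _ in range(5)]
print(shot_boundaries(dark + bright))   # [5]
```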

Relevance: 30.00%

Abstract:

Choosing between Light Rail Transit (LRT) and Bus Rapid Transit (BRT) systems is often controversial and not an easy task for transportation planners who are contemplating the upgrade of their public transportation services. These two transit systems provide comparable services for medium-sized cities, from the suburban neighborhood to the Central Business District (CBD), and utilize similar right-of-way (ROW) categories. The research aims to develop a method to assist transportation planners and decision makers in determining the more feasible of the LRT and BRT options. Cost estimation is a major factor when evaluating a transit system. Typically, LRT is more expensive to build and implement than BRT, but has significantly lower operating and maintenance (O&M) costs. This dissertation examines the factors impacting capacity and costs, and develops capacity-based cost models for the LRT and BRT systems. Various ROW categories and alignment configurations of the systems are also considered in the developed cost models. Kikuchi's fleet size model (1985) and a cost allocation method are used to develop the cost models that estimate capacity and costs. The comparison between LRT and BRT is complicated by the many possible transportation planning and operation scenarios. Finally, a user-friendly computer interface integrated with the established capacity-based cost models, the LRT and BRT Cost Estimator (LBCostor), was developed using the Microsoft Visual Basic language to facilitate the process and guide users through the comparison. The cost models and the LBCostor can be used to analyze transit volumes, alignments, ROW configurations, numbers of stops and stations, headway, vehicle size, and traffic signal timing at intersections. Planners can make the necessary changes and adjustments depending on their operating practices.
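
The dissertation's cost models are not reproduced in the abstract; the sketch below only illustrates the general capacity-driven logic behind a fleet-size and O&M estimate, with all formulas and input figures assumed for illustration.

```python
import math

def required_headway_min(peak_demand_per_hr, vehicle_capacity, load_factor=0.9):
    """Headway (minutes) needed so vehicles can carry the peak-hour demand."""
    vehicles_per_hr = peak_demand_per_hr / (vehicle_capacity * load_factor)
    return 60.0 / vehicles_per_hr

def fleet_size(round_trip_time_min, headway_min, spares_ratio=0.1):
    """Vehicles needed to maintain a headway over a given round-trip time."""
    in_service = math.ceil(round_trip_time_min / headway_min)
    return math.ceil(in_service * (1 + spares_ratio))

def annual_om_cost(fleet, cost_per_vehicle_hr, service_hrs_per_day, days=365):
    """Simple operating & maintenance cost estimate (illustrative only)."""
    return fleet * cost_per_vehicle_hr * service_hrs_per_day * days

# Illustrative comparison with assumed inputs (not from the dissertation):
for mode, capacity, cost_hr in [("LRT", 180, 250.0), ("BRT", 90, 140.0)]:
    h = required_headway_min(peak_demand_per_hr=3000, vehicle_capacity=capacity)
    n = fleet_size(round_trip_time_min=80, headway_min=h)
    print(mode, round(h, 1), "min headway,", n, "vehicles,",
          f"${annual_om_cost(n, cost_hr, 18):,.0f}/yr O&M")
```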

Relevance: 30.00%

Abstract:

Software development is an extremely complex process during which human errors are introduced, resulting in faulty software systems. It is highly desirable and important that these errors can be prevented and detected as early as possible. Software architecture design is a high-level system description, which embodies many system features and properties that are eventually implemented in the final operational system. Therefore, methods for modeling and analyzing software architecture descriptions can help prevent and reveal human errors and thus improve software quality. Furthermore, if an analyzed software architecture description can be used to derive a partial software implementation, especially when the derivation can be automated, significant benefits can be gained with regard to both system quality and productivity. This dissertation proposes a framework for integrated analysis of both the design and the implementation. To ensure the desirable properties of the architecture model, we apply formal verification using the model checking technique. To ensure the desirable properties of the implementation, we develop a methodology and an associated tool to translate an architecture specification into an implementation written in a combination of the ArchJava, Java and AspectJ programming languages. The translation is semi-automatic, so that many manual programming errors can be prevented. Furthermore, the translation inserts monitoring code into the implementation so that runtime verification can be performed, which provides additional assurance of the quality of the implementation. Moreover, validations of the translations from architecture model to program are provided. Finally, several case studies are conducted and presented.
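
The thesis targets ArchJava/Java/AspectJ, which is not reproduced here; as a loose analogy only, runtime verification through inserted monitoring code can be sketched with a Python decorator that checks a stated architectural property at every call. All names and the example property are hypothetical.

```python
import functools

def monitor(pre=None, post=None):
    """Wrap a component operation with runtime checks of stated properties."""
    def decorate(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            if pre is not None and not pre(*args, **kwargs):
                raise AssertionError(f"precondition violated in {func.__name__}")
            result = func(*args, **kwargs)
            if post is not None and not post(result):
                raise AssertionError(f"postcondition violated in {func.__name__}")
            return result
        return wrapper
    return decorate

# Hypothetical architectural property: a buffer connector never receives more
# items than its declared capacity and always reports a valid size.
CAPACITY = 8
buffer = []

@monitor(pre=lambda item: len(buffer) < CAPACITY,
         post=lambda size: 0 <= size <= CAPACITY)
def put(item):
    buffer.append(item)
    return len(buffer)

print(put("msg-1"))   # 1 -- checks pass; a violation would raise at runtime
```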