864 results for stream processing crowdsensing scheduling traffic analysis


Relevância: 100.00%

Resumo:

This thesis is developed from a real-life application: performance evaluation of small and medium-sized enterprises (SMEs) in Vietnam. It presents two main methodological developments on the evaluation of the impact of dichotomous environment variables on technical efficiency. Taking selection bias into account, the thesis proposes a revised frontier separation approach for the seminal Data Envelopment Analysis (DEA) model developed by Charnes, Cooper, and Rhodes (1981). The revised approach is based on nearest-neighbour propensity score matching, pairing treated SMEs with their counterfactuals on the propensity score. The thesis also develops an order-m frontier conditioned on the propensity score, building on the conditional order-m approach proposed by Cazals, Florens, and Simar (2002) and advocated by Daraio and Simar (2005). This development allows the conditional order-m approach to be applied with a dichotomous environment variable while accounting for the self-selection problem of impact evaluation. Monte Carlo style simulations were built to examine the effectiveness of these developments. The methodological developments are applied in empirical studies to evaluate the impact of training programmes on the performance of food processing SMEs and the impact of exporting on the technical efficiency of textile and garment SMEs in Vietnam. The analysis shows that training programmes have no significant impact on the technical efficiency of food processing SMEs. Moreover, it confirms the conclusion of the export literature that exporters self-select into the sector. The thesis finds no significant impact of exporting activities on the technical efficiency of textile and garment SMEs; however, a large bias is eliminated by the proposed approach.
The results of the empirical studies contribute to the understanding of the impact of different environment variables on the performance of SMEs, and help policy makers design appropriate policies to support the development of Vietnamese SMEs.
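The nearest-neighbour matching step described above can be sketched in a few lines. This is an illustrative sketch only, not the thesis's code; the unit identifiers and propensity scores are hypothetical.

```python
# Illustrative sketch: nearest-neighbour propensity score matching,
# pairing each treated unit with the control whose score is closest
# (matching with replacement). All data below is hypothetical.

def nearest_neighbour_match(treated, controls):
    """treated, controls: lists of (unit_id, propensity_score) tuples.
    Returns a list of (treated_id, matched_control_id) pairs."""
    pairs = []
    for t_id, t_score in treated:
        # Pick the control with minimal absolute score distance.
        c_id, _ = min(controls, key=lambda c: abs(c[1] - t_score))
        pairs.append((t_id, c_id))
    return pairs

# Hypothetical example: three treated SMEs, four controls.
treated = [("T1", 0.80), ("T2", 0.35), ("T3", 0.55)]
controls = [("C1", 0.10), ("C2", 0.40), ("C3", 0.60), ("C4", 0.78)]
print(nearest_neighbour_match(treated, controls))
# → [('T1', 'C4'), ('T2', 'C2'), ('T3', 'C3')]
```

Efficiency of the matched pairs can then be compared on the DEA frontier, which is the substance of the revised frontier separation approach.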

Relevância: 100.00%

Resumo:

The recent explosive growth in advanced manufacturing technology (AMT) and the continued development of sophisticated information technologies (IT) are expected to have a profound effect on the way we design and operate manufacturing businesses. Furthermore, the escalating capital requirements associated with these developments have significantly increased the level of risk associated with initial design, ongoing development and operation. This dissertation has examined the integration of two key sub-elements of the Computer Integrated Manufacturing (CIM) system, namely the manufacturing facility and the production control system. The research has concentrated on the interactions between production control (MRP) and an AMT-based production facility. The disappointing performance of such systems is discussed in the context of a number of potential technological and performance incompatibilities between these two elements. It is argued that the design and selection of operating policies for both is the key to successful integration. Furthermore, policy decisions are shown to play an important role in matching the performance of the total system to the demands of the marketplace. It is demonstrated that a holistic approach to policy design must be adopted if successful integration is to be achieved, and that the complexity of the issues arising from such an approach requires the formulation of a structured design methodology. Such a methodology was subsequently developed and discussed. It combines a first-principles approach to the behaviour of system elements with the specification of a detailed holistic model for use in the policy design environment. The methodology aims to make full use of the 'low inertia' characteristics of AMT, whilst adopting a JIT configuration of MRP and re-coupling the total system to market demands.
The dissertation discusses the application of the methodology to an industrial case study and the subsequent design of operational policies. A novel approach to production control resulted, a central feature of which was a move toward reduced manual intervention in the MRP processing and scheduling logic, with increased human involvement and motivation in the management of work-flow on the shopfloor. Experimental results indicated that significant performance advantages would result from adoption of the recommended policy set.

Relevância: 100.00%

Resumo:

This paper describes the methodology followed to automatically generate titles for a corpus of questions belonging to sociological opinion polls. Question titles have a twofold function: (1) they are the input of user searches, and (2) they inform about the whole content of the question and its possible answer options. Title generation can thus be considered a case of automatic summarization. However, the fact that summarization had to be performed over very short texts, together with the aforementioned quality conditions imposed on the generated titles, led the authors to follow knowledge-rich and domain-dependent strategies for summarization, disregarding the more frequent extractive techniques.

Relevância: 100.00%

Resumo:

Key words: Markov-modulated queues, waiting time, heavy traffic.

Relevância: 100.00%

Resumo:

In this paper, we test the extent to which producers' cooperatives can experience an increase in technical efficiency following a tightening of financial constraints. This hypothesis is tested on a sample of Italian conventional and cooperative firms in the wine production and processing sector, using frontier analysis. The results support the hypothesis that increasing financial pressure can positively affect the cooperatives' efficiency. Journal compilation © CIRIEC 2010.

Relevância: 100.00%

Resumo:

There is an increasing demand for DNA analysis because of the sensitivity of the method and its ability to uniquely identify and distinguish individuals with a high degree of certainty. This demand has led to huge backlogs in evidence lockers, since current DNA extraction protocols require long processing times. The DNA analysis procedure becomes more complicated when analyzing sexual assault casework samples, where the evidence contains more than one contributor. The additional processing required to separate different cell types, done to simplify the final data interpretation, further adds to the existing cumbersome protocols. The goal of the present project is to develop a rapid and efficient extraction method that permits selective digestion of mixtures. Selective recovery of male DNA was achieved with as little as 15 minutes of lysis time upon exposure to high pressure under alkaline conditions. Pressure cycling technology (PCT) is carried out in a barocycler that has a small footprint and is semi-automated. Whereas typically less than 10% of male DNA is recovered using the standard extraction protocol for rape kits, almost seven times more male DNA was recovered from swabs using this novel method. Various parameters, including instrument settings and buffer composition, were optimized to achieve selective recovery of sperm DNA. Developmental validation studies were also performed to determine the efficiency of this method in processing samples exposed to various conditions that can affect the quality of the extraction and the final DNA profile. An easy-to-use interface, minimal manual interference and the ability to achieve high yields with simple reagents in a relatively short time make this an ideal method for potential application in analyzing sexual assault samples.

Relevância: 100.00%

Resumo:

Reactive iron (oxyhydr)oxide minerals preferentially undergo early diagenetic redox cycling, which can result in the production of dissolved Fe(II), adsorption of Fe(II) onto particle surfaces, and the formation of authigenic Fe minerals. The partitioning of iron in sediments has traditionally been studied by applying sequential extractions that target operationally-defined iron phases. Here, we complement an existing sequential leaching method by developing a sample processing protocol for δ56Fe analysis, which we subsequently use to study Fe phase-specific fractionation related to dissimilatory iron reduction in a modern marine sediment. Carbonate-Fe was extracted by acetate, easily reducible oxides (e.g. ferrihydrite and lepidocrocite) by hydroxylamine-HCl, reducible oxides (e.g. goethite and hematite) by dithionite-citrate, and magnetite by ammonium oxalate. Subsequently, the samples were repeatedly oxidized, heated and purified via Fe precipitation and column chromatography. The method was applied to surface sediments collected from the North Sea, south of the Island of Helgoland. The acetate-soluble fraction (targeting siderite and ankerite) showed a pronounced downcore δ56Fe trend. This iron pool was most depleted in 56Fe close to the sediment-water interface, similar to trends observed for pore-water Fe(II). We interpret this pool as surface-reduced Fe(II), rather than siderite or ankerite, that was open to electron and atom exchange with the oxide surface. Common extractions using 0.5 M HCl or Na-dithionite alone may not resolve such trends, as they dissolve iron from isotopically distinct pools, leading to a mixed signal. Na-dithionite leaching alone, for example, targets the sum of reducible Fe oxides that potentially differ in their isotopic fingerprint. Hence, the development of a sequential extraction Fe isotope protocol provides a new opportunity for detailed study of the behavior of iron in a wide range of environmental settings.

Relevância: 100.00%

Resumo:

Marine isotope stage (MIS) 9 is one of the least investigated Pleistocene interglacials. The present study describes reconstructions of deepwater conditions during this time interval, based on benthic foraminiferal assemblages from sediment core M23414 (Rockall Plateau, North Atlantic). The results of the faunal analysis were supported by planktic δ18O, sea surface temperature reconstructions based on planktic foraminiferal assemblages, and the content of ice-rafted debris. Statistical processing of the data using principal component analysis revealed five climate-related benthic foraminiferal associations that changed in response to alterations of deepwater circulation.
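Principal component analysis, as used above, reduces multivariate assemblage counts to a few dominant axes of variation. As a minimal illustration (not the study's actual workflow), the leading component of two-variable data can be computed in closed form from the 2x2 covariance matrix; the sample values below are hypothetical.

```python
import math

def first_principal_component(points):
    """Return the unit eigenvector of the largest eigenvalue of the
    2x2 sample covariance matrix of (x, y) points."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    # Sample covariance entries: [[a, b], [b, c]].
    a = sum((p[0] - mx) ** 2 for p in points) / (n - 1)
    c = sum((p[1] - my) ** 2 for p in points) / (n - 1)
    b = sum((p[0] - mx) * (p[1] - my) for p in points) / (n - 1)
    # Larger eigenvalue of a symmetric 2x2 matrix, closed form.
    lam = ((a + c) + math.sqrt((a - c) ** 2 + 4 * b * b)) / 2
    # Corresponding eigenvector (degenerate case: axis-aligned).
    vx, vy = (b, lam - a) if b != 0 else ((1.0, 0.0) if a >= c else (0.0, 1.0))
    norm = math.hypot(vx, vy)
    return (vx / norm, vy / norm)

# Hypothetical abundance counts of two taxa across four samples,
# varying together along a single environmental gradient.
samples = [(1.0, 1.1), (2.0, 2.1), (3.0, 2.9), (4.0, 4.0)]
print(first_principal_component(samples))
```

For assemblage data with many taxa the same idea applies with a larger covariance matrix; the leading components are then interpreted as ecological associations.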

Relevância: 100.00%

Resumo:

The large upfront investments required for game development pose a severe barrier to the wider uptake of serious games in education and training. There is also a lack of well-established methods and tools that support game developers in preserving and enhancing the games' pedagogical effectiveness. The RAGE project, a Horizon 2020 funded research project on serious games, addresses these issues by making available reusable software components designed to support the pedagogical qualities of serious games. So that these game components can be easily deployed and integrated in a multitude of game engines, platforms and programming languages, RAGE has developed and validated a hybrid component-based software architecture that preserves component portability and interoperability. While a first set of software components is being developed, this paper presents selected examples to explain the overall system concept and its practical benefits. First, the Emotion Detection component uses the learners' webcams to capture their emotional states from facial expressions. Second, the Performance Statistics component is an add-on for learning analytics data processing, which allows instructors to track and inspect learners' progress without having to deal with the required statistical computations. Third, a set of language processing components accommodates the analysis of learners' textual inputs, facilitating comprehension assessment and prediction. Fourth, the Shared Data Storage component provides a technical solution for data storage (e.g. for player data or game world data) across multiple software components. The presented components are exemplary of the anticipated RAGE library, which will include up to forty reusable software components for serious gaming, addressing diverse pedagogical dimensions.
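The idea of engine-agnostic, reusable components can be illustrated with a toy sketch. This is a hypothetical interface, not the actual RAGE API; the class and method names are invented for illustration.

```python
# Hypothetical sketch of a portable game-component contract: a host
# engine drives components through a minimal process() hook, so no
# engine-specific bindings leak into the component itself.

class GameComponent:
    """Minimal contract every component implements."""
    name = "base"

    def process(self, event):
        raise NotImplementedError

class PerformanceStatistics(GameComponent):
    """Toy analytics add-on: tracks per-learner scores and reports
    a mean, hiding the statistics from the instructor-facing code."""
    name = "performance-statistics"

    def __init__(self):
        self.scores = {}

    def process(self, event):
        self.scores.setdefault(event["learner"], []).append(event["score"])

    def mean(self, learner):
        s = self.scores[learner]
        return sum(s) / len(s)

stats = PerformanceStatistics()
for score in (70, 80, 90):
    stats.process({"learner": "alice", "score": score})
print(stats.mean("alice"))  # → 80.0
```

Keeping the contract this small is what makes porting a component across engines and languages tractable, since only the thin host adapter changes.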

Relevância: 100.00%

Resumo:

This paper describes a face detection system that goes beyond traditional approaches normally designed for still images. The detector is applied in the context of a video stream, so the resulting system is designed to take advantage of a key feature available in video: temporal coherence. The system builds a feature-based model for each detected face and searches for the faces in the next frame using that model information. The results achieved for video stream processing outperform Rowley-Kanade's and Viola-Jones' solutions, providing eye and face data in reduced time with a notable correct detection rate.
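The temporal-coherence idea (associate each modelled face with a nearby detection in the next frame rather than re-scanning from scratch) can be sketched as a box-association step. This is an illustrative sketch, not the paper's algorithm; the boxes and the overlap threshold are hypothetical.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    ix = max(0, min(ax + aw, bx + bw) - max(ax, bx))
    iy = max(0, min(ay + ah, by + bh) - max(ay, by))
    inter = ix * iy
    union = aw * ah + bw * bh - inter
    return inter / union if union else 0.0

def associate(tracked, detections, threshold=0.3):
    """Link each tracked face to its best-overlapping detection in the
    next frame; unmatched detections become new faces."""
    links, used = {}, set()
    for face_id, box in tracked.items():
        best = max(
            (d for d in range(len(detections)) if d not in used),
            key=lambda d: iou(box, detections[d]),
            default=None,
        )
        if best is not None and iou(box, detections[best]) >= threshold:
            links[face_id] = detections[best]
            used.add(best)
    new_faces = [detections[d] for d in range(len(detections)) if d not in used]
    return links, new_faces

# Hypothetical frame pair: one face moved slightly, one face appeared.
tracked = {"face0": (100, 80, 40, 40)}
detections = [(104, 82, 40, 40), (300, 90, 42, 42)]
print(associate(tracked, detections))
```

In a real system the per-face feature model would score candidate regions; the association step above is only the coherence bookkeeping around it.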

Relevância: 100.00%

Resumo:

Recent years have seen an astronomical rise in SQL Injection Attacks (SQLIAs) used to compromise the confidentiality, authentication and integrity of organisations' databases. Intruders becoming smarter at obfuscating web requests to evade detection, combined with increasing volumes of web traffic from the Internet of Things (IoT), cloud-hosted and on-premise business applications, have made it evident that existing, mostly static signature-based approaches lack the ability to cope with novel signatures. SQLIA detection and prevention can be achieved by exploring an alternative bio-inspired supervised learning approach that takes as input a labelled dataset of numerical attributes for classifying true positives and negatives. We present in this paper Numerical Encoding to Tame SQLIA (NETSQLIA), which implements a proof of concept for scalable numerical encoding of features into a labelled dataset of attributes obtained from deep web traffic analysis. For the numerical attribute encoding, the model leverages a proxy to intercept and decrypt web traffic. The intercepted web requests are then assembled for front-end SQL parsing and pattern matching by applying a traditional Non-Deterministic Finite Automaton (NFA). The paper presents a technique for extracting numerical attributes of any size, primed as an input dataset to an Artificial Neural Network (ANN) and statistical Machine Learning (ML) algorithms, implemented using a Two-Class Averaged Perceptron (TCAP) and Two-Class Logistic Regression (TCLR) respectively. This methodology then forms the subject of an empirical evaluation of the suitability of the model for accurate classification of both legitimate web requests and SQLIA payloads.

Relevância: 100.00%

Resumo:

Internship report presented to the Escola Superior de Educação de Paula Frassinetti to obtain the degree of Master in Pre-School Education (Mestre em Educação Pré-Escolar).

Relevância: 100.00%

Resumo:

This thesis presents an interoperable architecture designed for the acquisition, manipulation, processing and analysis of geographic information. The 3D application, implemented as part of the architecture, besides allowing the visualization and manipulation of spatial data within a 3D environment, offers methods for discovering, accessing and using geo-processes available through Web Services. Furthermore, user interaction follows an approach that breaks the typical complexity of most Geographic Information Systems. This simplicity is generally achieved through a visual programming approach that allows operators to take advantage of location and to use complex processes through abstract representations: processing units are represented on the terrain through 3D components, which can be directly manipulated and linked to create complex process chains. New processes can also be visually created and deployed online.