991 results for process parameter data


Relevance:

90.00%

Publisher:

Abstract:

Data analytic applications are characterized by large data sets that are subject to a series of processing phases. Some of these phases are executed sequentially, but others can be executed concurrently or in parallel on clusters, grids, or clouds. The MapReduce programming model has been applied to process large data sets in cluster and cloud environments. To develop an application using MapReduce, one needs to install, configure, and access specific frameworks such as Apache Hadoop or Elastic MapReduce in the Amazon Cloud. It would be desirable to have more flexibility in adjusting such configurations according to the application's characteristics. Furthermore, composing the multiple phases of a data analytic application requires the specification of all the phases and their orchestration. The original MapReduce model and environment lack flexible support for such configuration and composition. Recognizing that scientific workflows have been successfully applied to modeling complex applications, this paper describes our experiments on implementing MapReduce as subworkflows in the AWARD framework (Autonomic Workflow Activities Reconfigurable and Dynamic). A text mining data analytic application is modeled as a complex workflow with multiple phases, where individual workflow nodes support MapReduce computations. As in typical MapReduce environments, the end user only needs to define the application algorithms for input data processing and for the map and reduce functions. We present experimental results from using the AWARD framework to execute MapReduce workflows deployed over multiple Amazon EC2 (Elastic Compute Cloud) instances.
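To make the division of labor concrete, the following Python sketch shows the kind of user-defined map and reduce functions such an environment expects, here for a word-count style text mining phase; the function names and the simple in-memory driver are illustrative assumptions, not the AWARD framework's actual API.

# Hypothetical user-defined functions for a word-count text mining phase;
# only map_fn and reduce_fn would be supplied by the end user, while the
# framework handles distribution (the driver below is a toy stand-in).
from collections import defaultdict

def map_fn(document):
    """Emit a (word, 1) pair for every word in the input document."""
    for word in document.lower().split():
        yield word, 1

def reduce_fn(word, counts):
    """Sum the partial counts emitted for a single word."""
    return word, sum(counts)

def run_mapreduce(documents):
    """Toy sequential driver: group map outputs by key, then reduce."""
    groups = defaultdict(list)
    for doc in documents:
        for word, count in map_fn(doc):
            groups[word].append(count)
    return dict(reduce_fn(w, c) for w, c in groups.items())

print(run_mapreduce(["to be or not to be"]))  # {'to': 2, 'be': 2, 'or': 1, 'not': 1}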

Relevance:

90.00%

Publisher:

Abstract:

Integrated master's dissertation in Information Systems Engineering and Management

Relevance:

90.00%

Publisher:

Abstract:

Research project carried out during a stay at the National Oceanography Centre, Southampton (NOCS), Great Britain, between May and July 2006. The ability to obtain an accurate estimate of sea surface salinity (SSS) is important for investigating and predicting the extent of climate change. The Soil Moisture and Ocean Salinity (SMOS) mission was selected by the European Space Agency (ESA) to obtain sea surface salinity maps on a global scale and with a short revisit time. Before the launch of SMOS, the horizontal variability of the SSS and the potential of the data retrieved from SMOS measurements to reproduce known oceanographic behaviour are to be analysed. The overall objective is to fill the gap between reliable input/auxiliary data sources and the tools developed to simulate and process the data acquired under the SMOS configuration. The SMOS End-to-end Performance Simulator (SEPS) is an ad hoc simulator developed by the Universitat Politècnica de Catalunya (UPC) to generate data according to the SMOS configuration. Input data for SEPS were taken from the Ocean Circulation and Climate Advanced Modeling (OCCAM) project, used at NOCS, at different spatial resolutions. By modifying SEPS to accept the OCCAM data as input, simulated brightness temperature data were obtained for one month of different ascending passes covering the selected area. The tasks carried out during the stay at NOCS were intended to provide a reliable technique for performing the external calibration and thus cancelling the bias, a methodology for temporally averaging the different acquisitions during ascending passes, and the best configuration of the cost function, before exploiting and investigating the potential of the SEPS/OCCAM data to derive retrieved SSS with high-resolution patterns.

Relevance:

90.00%

Publisher:

Abstract:

While much of the literature on immigrants' assimilation has focused on countries with a long tradition of receiving immigrants and with flexible labor markets, very little is known about how immigrants adjust to other types of host economies. With its severely dual labor market and an unprecedented immigration boom, Spain presents a rather unique setting in which to analyze immigrants' assimilation process. Using data from the 2000 to 2008 Labor Force Survey, we find that immigrants are more occupationally mobile than natives, and that much of this greater flexibility is explained by immigrants' assimilation process soon after arrival. However, we find little evidence of convergence, especially among women and high-skilled immigrants. This suggests that instead of integrating, immigrants occupationally segregate, providing evidence consistent with both imperfect substitutability and immigrants' human capital being undervalued. Additional evidence on the assimilation of earnings and the incidence of permanent employment by skill level also supports the hypothesis of segmented labor markets.

Relevance:

90.00%

Publisher:

Abstract:

The current research project is both a process and impact evaluation of community policing in Switzerland's five major urban areas - Basel, Bern, Geneva, Lausanne, and Zurich. Community policing is both a philosophy and an organizational strategy that promotes a renewed partnership between the police and the community to solve problems of crime and disorder. The process evaluation data on police internal reforms were obtained through semi-structured interviews with key administrators from the five police departments as well as from police internal documents and additional public sources. The impact evaluation uses official crime records and census statistics as contextual variables as well as Swiss Crime Survey (SCS) data on fear of crime, perceptions of disorder, and public attitudes towards the police as outcome measures. The SCS is a standing survey instrument that has polled residents of the five urban areas repeatedly since the mid-1980s. The process evaluation produced a "Calendar of Action" to create panel data to measure community policing implementation progress over six evaluative dimensions in intervals of five years between 1990 and 2010. The impact evaluation, carried out ex post facto, uses an observational design that analyzes the impact of the different community policing models between matched comparison areas across the five cities. Using ZIP code districts as proxies for urban neighborhoods, geospatial data mining algorithms serve to develop a neighborhood typology in order to match the comparison areas.
To this end, both unsupervised and supervised algorithms are used to analyze high-dimensional data on crime, the socio-economic and demographic structure, and the built environment in order to classify urban neighborhoods into clusters of similar type. In a first step, self-organizing maps serve as tools to develop a clustering algorithm that reduces the within-cluster variance in the contextual variables and simultaneously maximizes the between-cluster variance in survey responses. The random forests algorithm then serves to assess the appropriateness of the resulting neighborhood typology and to select the key contextual variables in order to build a parsimonious model that makes a minimum of classification errors. Finally, for the impact analysis, propensity score matching methods are used to match the survey respondents of the pretest and posttest samples on age, gender, and their level of education for each neighborhood type identified within each city, before conducting a statistical test of the observed difference in the outcome measures. Moreover, all significant results were subjected to a sensitivity analysis to assess the robustness of these findings in the face of potential bias due to some unobserved covariates. The study finds that over the last fifteen years, all five police departments have undertaken major reforms of their internal organization and operating strategies and forged strategic partnerships in order to implement community policing. The resulting neighborhood typology reduced the within-cluster variance of the contextual variables and accounted for a significant share of the between-cluster variance in the outcome measures prior to treatment, suggesting that geocomputational methods help to balance the observed covariates and hence to reduce threats to the internal validity of an observational design. Finally, the impact analysis revealed that fear of crime dropped significantly over the 2000-2005 period in the neighborhoods in and around the urban centers of Bern and Zurich. These improvements are fairly robust in the face of bias due to some unobserved covariate and covary temporally and spatially with the implementation of community policing. The alternative hypothesis that the observed reductions in fear of crime were at least in part a result of community policing interventions thus appears at least as plausible as the null hypothesis of absolutely no effect, even if the observational design cannot completely rule out selection and regression to the mean as alternative explanations.
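As an illustration of the matching step described above, the following Python sketch performs 1:1 nearest-neighbor propensity score matching on age, gender, and education; the column names, the synthetic data, and the matching scheme are assumptions for illustration, not the study's actual code or data.

# Minimal propensity-score matching sketch (hypothetical column names;
# covariates assumed numerically coded). Not the study's actual code.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def match_pretest_posttest(df):
    """Match each posttest respondent to the nearest pretest respondent."""
    covariates = ["age", "gender", "education"]
    # Propensity score: estimated probability of belonging to the
    # posttest wave given the observed covariates.
    model = LogisticRegression().fit(df[covariates], df["posttest"])
    df = df.assign(pscore=model.predict_proba(df[covariates])[:, 1])
    treated = df[df["posttest"] == 1]
    control = df[df["posttest"] == 0]
    # 1:1 nearest-neighbor matching on the propensity score.
    nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
    _, idx = nn.kneighbors(treated[["pscore"]])
    return pd.concat([treated, control.iloc[idx.ravel()]])

rng = np.random.default_rng(0)
df = pd.DataFrame({"age": rng.integers(18, 80, 300),
                   "gender": rng.integers(0, 2, 300),
                   "education": rng.integers(1, 4, 300),
                   "posttest": rng.integers(0, 2, 300)})
print(len(match_pretest_posttest(df)))  # treated + matched controls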

Relevance:

90.00%

Publisher:

Abstract:

The Quadrennial Needs Study was developed to assist in the identification of highway needs and the distribution of road funds in Iowa among the various highway entities. During the period 1978 to 1990, the process saw large shifts in needs and associated funding distribution in individual counties for no apparent reason. This study investigated the reasons for such shifts. The study identified program inputs for which minor changes in input values can produce major shifts in needs, either up or down. The areas of concern were identified as the condition ratings for roads and structures, traffic volume and mix counts, and the assignment of construction cost areas. Eight counties exhibiting large shifts (greater than 30%) in needs over time were used to test the sensitivity of the variables. A ninth county was used as the baseline for the study. Recommendations are made for improvements in the data collection process in the areas of road and structure condition rating, traffic, and the assignment of construction cost areas. Advice is also offered on how to account for changes in jurisdiction between successive studies. Maintenance cost area assignment and levels of maintenance service are identified as requiring additional detailed research.

Relevance:

90.00%

Publisher:

Abstract:

In this work we present a simulation of a recognition process for simple plant leaves, using perimeter characterization as the single discriminating parameter. Data coding that makes the description independent of leaf size and orientation may penalize recognition performance for some varieties. Border description sequences are used to characterize the leaves. Independent Component Analysis (ICA) is then applied in order to study the best number of components to consider for the classification task, which is implemented by means of an Artificial Neural Network (ANN). The results obtained with ICA as a pre-processing tool are satisfactory: compared with some reference systems, our system improves recognition success to up to 80.8%, depending on the number of independent components considered.
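A minimal Python sketch of this pipeline, ICA as a pre-processing step feeding an ANN classifier, is given below; the data shapes, number of components, and network size are illustrative assumptions, since the paper's descriptors and dataset are not reproduced here.

# ICA pre-processing followed by an ANN classifier (illustrative sketch;
# random stand-in data instead of real perimeter descriptors).
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))    # 200 leaves, 64-point border descriptors
y = rng.integers(0, 4, size=200)  # 4 hypothetical leaf varieties

# n_components is the quantity the study tunes for best recognition.
clf = make_pipeline(
    FastICA(n_components=16, random_state=0),
    MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000, random_state=0),
)
clf.fit(X, y)
print(f"training accuracy: {clf.score(X, y):.3f}")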

Relevance:

90.00%

Publisher:

Abstract:

The building industry has a particular interest in using clinching as a joining method for frame constructions in light-frame housing. Normally many clinch joints are required when joining frames. In order to maximise the strength of the complete assembly, each clinch joint must be as sound as possible. Experimental testing is the main means of optimising a particular clinch joint. This includes shear strength testing and visual observation of joint cross-sections. The manufacturers of clinching equipment normally perform such experimental trials. Finite element analysis can also be used to optimise the tool geometry and the process parameter, X, which represents the thickness of the base of the joint. However, such procedures require dedicated software, a skilled operator, and test specimens in order to verify the finite element model. In addition, with current technology several hours of computing time may be necessary. The objective of the study was to develop a simple calculation procedure for rapidly establishing an optimum value of the parameter X for a given tool combination. It should be possible to use the procedure on a daily basis, without stringent demands on the skill of the operator or the equipment. It is also desirable that the procedure significantly decrease the number of shear strength tests required for verification. The experimental work involved tests to obtain an understanding of the behaviour of the sheets during clinching. The most notable observation concerned the stage of the process in which the upper sheet was initially bent, after which the deformation mechanism changed to shearing and elongation. The amount of deformation was measured relative to the original location of the upper sheet, and characterised as the C-measure. By understanding in detail the behaviour of the upper sheet, it was possible to estimate a bending line function for the surface of the upper sheet. A procedure was developed which makes it possible to estimate the process parameter X for each tool combination with a fixed die. The procedure is based on equating the volume of material on the punch side with the volume of the die. Detailed information concerning the behaviour of material on the punch side is required, assuming that the volume of the die does not change during the process. The procedure was applied to shear strength testing of a sample material. The sample material was continuously hot-dip zinc-coated high-strength constructional steel, with a nominal thickness of 1.0 mm. The minimum Rp0.2 proof stress was 637 N/mm². Such material has not yet been used extensively in light-frame housing, and little has been published on clinching of the material. The performance of the material is therefore of particular interest. Companies that use clinching on a daily basis stand to gain the greatest benefit from the procedure. By understanding the behaviour of the sheets in different cases, it is possible to use the data at an early stage for adjusting and optimising the process. In particular, the functionality of common tools can be increased, since it is possible to characterise the complete range of existing tools. The study increases and broadens the amount of basic information concerning the clinching process. New approaches and points of view are presented and used for generating new knowledge.
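The volume-balance idea can be illustrated with a deliberately simplified Python sketch: if the material thinned under the punch is assumed to flow into and exactly fill a fixed die cavity of known volume, X follows from simple geometry. The cylindrical geometry and the example numbers are assumptions for illustration; the thesis's actual procedure uses the measured C-measure and an estimated bending line function rather than this crude model.

# Rough volume-balance estimate of the bottom thickness X (mm), assuming
# a cylindrical punch and a die cavity of fixed, known volume. Purely
# illustrative; not the thesis's bending-line-based procedure.
import math

def estimate_x(punch_diameter_mm, total_sheet_thickness_mm, die_volume_mm3):
    area = math.pi * (punch_diameter_mm / 2.0) ** 2
    # Material thinned from t_total down to X under the punch is assumed
    # to fill the die cavity: area * (t_total - X) = V_die.
    return total_sheet_thickness_mm - die_volume_mm3 / area

# Example: 5 mm punch, two 1.0 mm sheets, hypothetical 25 mm^3 die cavity.
print(f"X = {estimate_x(5.0, 2.0, 25.0):.2f} mm")  # about 0.73 mm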

Relevance:

90.00%

Publisher:

Abstract:

Laser additive manufacturing (LAM), also known as 3D printing, has gained a lot of interest in recent years within various industries, such as the medical and aerospace industries. LAM enables the fabrication of complex 3D geometries by melting metal powder layer by layer with a laser beam. Over the past 10 years, research in laser additive manufacturing has focused on the development of new materials and new applications. Since this technology is on the cutting edge, the efficiency of the manufacturing process is a central research topic in this industry. The aim of this thesis is to characterize methods for improving process efficiency in laser additive manufacturing, and to clarify the effect of process parameters on the stability of the process and on the microstructure of the manufactured pieces. The experimental tests of this thesis were made with various process parameters, and their effect on the built pieces was studied; additive manufacturing was performed with a modified research machine representing the EOSINT M series and with an EOS EOSINT M280. The material used was 17-4 PH stainless steel. Some of the methods for improving process efficiency were also tested. The literature review of this thesis presents the basics of laser additive manufacturing, methods for improving process efficiency, and laser beam-material interaction. It was observed that there are only a few public studies on the process efficiency of laser additive manufacturing of stainless steel. According to the literature, it is possible to improve process efficiency with higher-power lasers and thicker layers. Such improvement is possible if the effect of process parameter changes on the manufactured pieces is known. From the experiments carried out in this thesis, it was concluded that process parameters play a major role in single-track formation in laser additive manufacturing. Rough estimation equations were created to describe the effect of the input parameters on the output parameters. The experimental results showed that the WDA (the width, depth, and cross-sectional area of a single track) correlates exponentially with the energy density input, which combines the input parameters of laser power, laser beam spot diameter, and scan speed. The skin-core technique enables improved process efficiency: the core of the part is manufactured with higher laser power and a thicker layer, and the skin with lower laser power and a thinner layer in order to maintain high resolution. In this technique the interface between skin and core must overlap in order to achieve fully dense parts. It was also noticed that a keyhole can form in the LAM process: the threshold intensity of 10⁶ W/cm² was exceeded during the tests, which means that keyhole formation was possible in these tests.
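A small Python sketch makes the stated relationship concrete: energy density is computed from laser power, scan speed, and spot diameter, and an exponential curve is fitted to single-track measurements. The energy density definition E = P / (v · d) is a common convention assumed here, and the data points and fitted constants are invented for illustration, not the thesis's values.

# Exponential fit of a single-track response to energy density input
# (illustrative numbers; E = P / (v * d) assumed as the definition).
import numpy as np
from scipy.optimize import curve_fit

def energy_density(power_w, scan_speed_mm_s, spot_diameter_mm):
    """Energy density input in J/mm^2."""
    return power_w / (scan_speed_mm_s * spot_diameter_mm)

# Hypothetical measurements: energy density vs. single-track width.
E = np.array([1.0, 2.0, 4.0, 8.0])          # J/mm^2
width = np.array([0.10, 0.14, 0.21, 0.40])  # mm

# Model of the form w = a * exp(b * E).
(a, b), _ = curve_fit(lambda E, a, b: a * np.exp(b * E), E, width,
                      p0=(0.1, 0.2))
print(f"w = {a:.3f} * exp({b:.3f} * E)")
print(f"E at 200 W, 800 mm/s, 0.1 mm spot: "
      f"{energy_density(200, 800, 0.1):.2f} J/mm^2")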

Relevance:

90.00%

Publisher:

Abstract:

Poster at Open Repositories 2014, Helsinki, Finland, June 9-13, 2014

Relevance:

90.00%

Publisher:

Abstract:

The thesis aims to build a coherent view and understanding of the innovation process and organizational technology adoption in Finnish bio-economy companies, with a focus on innovations of a disruptive nature. Disruptive innovations are exceptional; hence, in order to draw generalizations and form a unified view of the subject, less radical innovations are also considered. Other interests of the thesis are how ideas are discovered and generated, and how the nature of the innovation and the size of the company affect technology adoption and the innovation process. The data was collected by interviewing six small and six large Finnish bio-economy companies. The results suggest that companies, regardless of size, consider innovation a core asset in competitive markets. Organizations want to be considered innovators and early adopters, yet these ambitions are limited by certain, mainly resource-based, factors. In addition, the industry, scalability, and Finland's geographical location when seeking funding pose certain challenges. The innovation process may be considered relatively similar whether the idea or technology stems from an internal or external source, suggesting that the technology adoption process can in fact be linked to innovation process theories. The thesis therefore introduces a new theoretical model which, based on the results of the study and the theories of technology adoption and the innovation process, aims to characterize how ideas and technology from both external and internal sources develop into innovations. In large bio-economy companies the innovation process is most often similar to, or a modified version of, the stage-gate model, while small companies generally have less structured processes. Nevertheless, the more disruptive the innovation, the less well it fits structured processes. This implies that disruptive innovation cannot be forced into a fixed mould; rather, it is processed case by case.

Relevance:

90.00%

Publisher:

Abstract:

The recent rapid development of biotechnological approaches has enabled the production of large whole genome level biological data sets. In order to handle these data sets, reliable and efficient automated tools and methods for data processing and result interpretation are required. Bioinformatics, as the field of studying and processing biological data, tries to answer this need by combining methods and approaches across computer science, statistics, mathematics and engineering. The need is also increasing for tools that can be used by the biological researchers themselves, who may not have a strong statistical or computational background, which requires creating tools and pipelines with intuitive user interfaces, robust analysis workflows and a strong emphasis on result reporting and visualization. Within this thesis, several data analysis tools and methods have been developed for analyzing high-throughput biological data sets. These approaches, covering several aspects of high-throughput data analysis, are specifically aimed at gene expression and genotyping data, although in principle they are suitable for analyzing other data types as well. Coherent handling of the data across the various data analysis steps is highly important in order to ensure robust and reliable results. Thus, robust data analysis workflows are also described, putting the developed tools and methods into a wider context. The choice of the correct analysis method may also depend on the properties of the specific data set, and therefore guidelines for choosing an optimal method are given. The data analysis tools, methods and workflows developed within this thesis have been applied to several research studies, of which two representative examples are included in the thesis. The first study focuses on spermatogenesis in murine testis and the second one examines cell lineage specification in mouse embryonic stem cells.

Relevance:

90.00%

Publisher:

Abstract:

In this study, an infrared thermography based sensor was evaluated with regard to its usability and the accuracy of its data as a weld penetration signal in gas metal arc welding. The object of the study was a specific sensor type which measures thermography from the solidified weld surface. The purpose of the study was to provide expert data for developing a sensor system for adaptive metal active gas (MAG) welding. Welding experiments with the considered process variables were performed, and the recorded thermal profiles were saved to a database for further analysis. To perform the analysis within a reasonable number of experiments, the process parameter variables were gradually altered by at least 10 %. The effects of the process variables on weld penetration and on the thermography itself were then considered. The SFS-EN ISO 5817 (2014) standard was applied for classifying the quality of the experiments. As a final step, a neural network was trained on the experiments. The experiments show that the studied thermography sensor and the neural network can be used for controlling full penetration, though they have minor limitations, which are presented in the results and discussion. The results are consistent with previous studies and experiments found in the literature.
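As an illustration of that final step, the following Python sketch trains a small neural network to classify penetration from thermal profiles; the feature layout, labels, and network size are assumptions made for the example, since the study's data and architecture are not reproduced here.

# Small neural network mapping thermal profiles to a penetration class
# (synthetic stand-in data; illustrative only).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 32))    # 120 welds, 32-sample thermal profiles
y = rng.integers(0, 2, size=120)  # 1 = full penetration, 0 = partial

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(X_train, y_train)
print(f"held-out accuracy: {net.score(X_test, y_test):.2f}")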

Relevance:

90.00%

Publisher:

Abstract:

This thesis explored Canadian high performance Athletes' perceptions of the fairness of the SDRCC sport-specific arbitral process. Leventhal's (1980) model of procedural justice judgment was found to be an effective tool for exploring Athletes' perceptions of the fairness of the process. Five of his six procedural justice antecedents (consistency, bias suppression, accuracy of information, representativeness, and ethicality) influenced the Athletes' perceptions of the fairness of the process. Emergent data revealed that the Athletes' perceptions of fairness were also influenced by three contextual factors and an additional antecedent of procedural justice. The efficiency of the process, the inherent power imbalance between Athletes and NSOs, and the measurable effect of the process on personal and professional relationships differentiate sport-specific arbitration from most other allocation processes. The data also indicated that the opportunity to voice one's case was an important determinant of the Athletes' perceptions of the fairness of the process.

Relevance:

90.00%

Publisher:

Abstract:

This article analyses the results of an empirical study of the 200 most popular UK-based websites in various sectors of e-commerce services. The study provides empirical evidence of unlawful processing of personal data. It comprises a survey of the methods used to seek and obtain consent to process personal data for direct marketing and advertising, and a test of the frequency of unsolicited commercial emails (UCE) received by customers as a consequence of their registration and submission of personal information to a website. Part One of the article presents a conceptual and normative account of data protection, with a discussion of the ethical values on which EU data protection law is grounded and an outline of the elements that must be in place to seek and obtain valid consent to process personal data. Part Two discusses the outcomes of the empirical study, which unveils a significant gap between EU data protection law in theory and in practice. Although a wide majority of the websites in the sample (69%) have in place a system to ask for separate consent to engage in marketing activities, only 16.2% of them obtain consent that is valid under the standards set by EU law. The test with UCE shows that only one out of three websites (30.5%) respects the will of the data subject not to receive commercial communications. It also shows that, when submitting personal data in online transactions, there is a high probability (50%) of encountering a website that will ignore the refusal of consent and send UCE. The article concludes that there is a severe lack of compliance by UK online service providers with essential requirements of data protection law. In this respect, it suggests that the standard of implementation, information, and supervision by the UK authorities is inadequate, especially in light of the clarifications provided at EU level.