887 results for Structured and unstructured orchestration components
Abstract:
Most modern models of personality are hierarchical, perhaps as a result of their development by means of exploratory factor analysis. Based on new ideas about the structure of personality and how it divides into biologically based and sociocognitively based components (as proposed by Carver, Cloninger, Elliot and Thrash, and Revelle), I develop a series of rules that show how scales of personality may be linked, from those that are most distal to those that are most proximal. I use SEM to confirm the proposed structure in scales of the Temperament and Character Inventory (TCI) and the Eysenck Personality Profiler. Good fit is achieved and all proposed paths are significant. The model is then used to predict work performance, deviance and job satisfaction.
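The core idea, linking distal scales to proximal ones along directed paths, can be illustrated with simulated data. The sketch below is not the paper's SEM: it simulates one hypothetical distal-to-proximal-to-outcome chain and recovers the path coefficients with ordinary least squares.

```python
# A minimal sketch (not the paper's SEM): simulate one hypothetical
# distal -> proximal -> outcome chain and recover the path coefficients
# with ordinary least squares.
import numpy as np

rng = np.random.default_rng(0)
n = 1000

distal = rng.normal(size=n)                              # e.g. a temperament scale
proximal = 0.6 * distal + rng.normal(scale=0.8, size=n)  # a sociocognitive scale
performance = 0.5 * proximal + rng.normal(scale=0.9, size=n)

def path(x, y):
    """OLS slope of y on x: the path coefficient in this toy model."""
    return np.cov(x, y)[0, 1] / np.var(x, ddof=1)

print(f"distal -> proximal:      {path(distal, proximal):.2f}")     # ~0.6
print(f"proximal -> performance: {path(proximal, performance):.2f}")  # ~0.5
```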
Abstract:
This work analyses Social Communication in the context of the internet and outlines new research methodologies for the field, aimed at filtering scientifically relevant meaning from the information flows of social networks, news media, or any other device that allows storage of and access to structured and unstructured information. Reflecting on the paths along which these information flows develop, and especially on the volume produced, the project maps the fields of meaning that this relationship configures in research theory and practice. The general objective of this work is to situate Social Communication within the mutable and dynamic reality of the internet environment and to draw parallels with applications already achieved in other fields. Using the case study method, three cases were analysed through two conceptual lenses, Web Sphere Analysis and Web Science, contrasting the information systems in their discursive and structural aspects. The aim is to observe what Social Communication gains, through these perspectives, in how it visualises its objects of study in the internet environment. The research shows that seeking out new kinds of learning is a challenge for the Social Communication researcher, but the feedback of information in the collaborative environment of the internet is fertile ground for research, since data modelling gains an analytical corpus when the set of tools promoted and driven by technology makes it possible to isolate content and to deepen the analysis of meanings and their relations.
Abstract:
Recently, we have seen an explosion of interest in ontologies as artifacts to represent human knowledge and as critical components in knowledge management, the semantic Web, business-to-business applications, and several other application areas. Various research communities commonly assume that ontologies are the appropriate modeling structure for representing knowledge. However, little discussion has occurred regarding the actual range of knowledge an ontology can successfully represent.
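As a concrete illustration of the kind of modelling structure the abstract questions, the sketch below represents a toy ontology as subject-predicate-object triples with a transitive is-a query; the vocabulary is invented and is not drawn from the paper.

```python
# A minimal sketch of ontology-style knowledge: subject-predicate-object
# triples plus a transitive "is_a" query. Terms are illustrative only.
triples = {
    ("dog", "is_a", "mammal"),
    ("mammal", "is_a", "animal"),
    ("dog", "has_part", "tail"),
}

def ancestors(term, rel="is_a"):
    """Return every class reachable from `term` along `rel` edges."""
    found, frontier = set(), {term}
    while frontier:
        step = {o for (s, p, o) in triples if p == rel and s in frontier}
        frontier = step - found
        found |= step
    return found

print(ancestors("dog"))  # {'mammal', 'animal'}
```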
Abstract:
We have proposed a novel robust inversion-based neurocontroller that searches for the optimal control law by sampling from the estimated Gaussian distribution of the inverse plant model. However, for problems involving the prediction of continuous variables, a Gaussian model approximation provides only a very limited description of the properties of the inverse model. This is usually the case for problems in which the mapping to be learned is multi-valued or involves hysteretic transfer characteristics, as often arises in the solution of inverse plant models. In order to obtain a complete description of the inverse model, a more general multicomponent distribution must be modeled. In this paper we test whether our proposed sampling approach can be used with arbitrary conditional probability distributions, modeled here by a mixture density network. Importance sampling provides a structured and principled approach to constrain the complexity of the search space for the ideal control law. The effectiveness of importance sampling from an arbitrary conditional probability distribution is demonstrated using a simple single-input single-output static nonlinear system with hysteretic characteristics in the inverse plant model.
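To make the sampling idea concrete, the sketch below stands a two-component Gaussian mixture in for the mixture density network's predicted conditional distribution, and importance-weights the samples by a Gaussian likelihood of reaching a target output of a toy plant with a multi-valued inverse. The plant, the mixture parameters and the noise scale are all invented for illustration.

```python
# A minimal sketch: sample candidate controls from a two-component Gaussian
# mixture (standing in for an MDN's predicted conditional distribution for
# one target output), then importance-weight each sample by how well it
# drives a toy plant with a multi-valued inverse to the target.
import numpy as np

rng = np.random.default_rng(1)

weights = np.array([0.5, 0.5])   # mixture parameters, as an MDN would predict
means = np.array([-1.0, 1.0])
stds = np.array([0.3, 0.3])

def sample_mixture(n):
    comp = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[comp], stds[comp])

def plant(u):
    """Toy static nonlinearity; both -1 and +1 map to 1 (multi-valued inverse)."""
    return u ** 2

y_target = 1.0
u = sample_mixture(500)
# Importance weights: Gaussian likelihood of hitting the target output.
w = np.exp(-0.5 * ((plant(u) - y_target) / 0.1) ** 2)
u_best = u[np.argmax(w)]
print(f"selected control: {u_best:.2f} (plant output {plant(u_best):.2f})")
```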
Abstract:
This article examines female response to gender role portrayals in advertising in Ukraine and Turkey. As both countries are potential new EU candidates, we argue that gender stereotypes can also be used as a 'barometer' of progress towards more generally accepted EU norms of behaviour towards women. While their histories remain different, both politically and in societal values, both countries currently face constraints that require convergence or justification of practices and understanding. Principal components analysis is employed over 290 questionnaires to identify the underlying dimensions. Results indicate overall similarities in perceptions and fragmentation within groups, but suggest divergence regarding thresholds.
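The analysis step is standard principal components analysis. A minimal sketch with scikit-learn on simulated Likert-style responses (the real study's 290 questionnaires are not reproduced here) looks like this:

```python
# A minimal sketch of the analysis step: PCA over questionnaire responses.
# The data are simulated with two latent attitude dimensions.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
n_respondents, n_items = 290, 12
latent = rng.normal(size=(n_respondents, 2))   # two latent dimensions
loadings = rng.normal(size=(2, n_items))
responses = latent @ loadings + rng.normal(scale=0.5, size=(n_respondents, n_items))

pca = PCA(n_components=4)
pca.fit(responses)
print("explained variance ratios:", pca.explained_variance_ratio_.round(2))
# Components with high explained variance suggest underlying dimensions.
```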
Abstract:
There have been two main approaches to feature detection in human and computer vision, based either on the luminance distribution and its spatial derivatives, or on the spatial distribution of local contrast energy. Thus, bars and edges might arise from peaks of luminance and luminance gradient respectively, or bars and edges might be found at peaks of local energy, where local phases are aligned across spatial frequency. This basic issue of definition is important because it guides more detailed models and interpretations of early vision. Which approach better describes the perceived positions of features in images? We used the class of 1-D images defined by Morrone and Burr in which the amplitude spectrum is that of a (partially blurred) square-wave and all Fourier components have a common phase. Observers used a cursor to mark where bars and edges were seen for different test phases (Experiment 1) or judged the spatial alignment of contours that had different phases (e.g. 0° and 45°; Experiment 2). The feature positions defined by both tasks shifted systematically to the left or right according to the sign of the phase offset, increasing with the degree of blur. These shifts were well predicted by the location of luminance peaks (bars) and gradient peaks (edges), but not by energy peaks which (by design) predicted no shift at all. These results encourage models based on a Gaussian-derivative framework, but do not support the idea that human vision uses points of phase alignment to find local, first-order features. Nevertheless, we argue that both approaches are presently incomplete and a better understanding of early vision may combine insights from both.
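The competing feature definitions are easy to compute on a synthetic version of the stimuli described above. The sketch below builds a 1-D profile with a square-wave (odd-harmonic, 1/f) amplitude spectrum and a common 45° phase, then locates the luminance peak, the gradient peak and the local-energy peak (via the Hilbert transform); the construction is schematic and omits the blurring manipulation.

```python
# A minimal sketch contrasting the feature definitions on a 1-D
# Morrone/Burr-type stimulus: odd harmonics with 1/f amplitudes sharing a
# common phase offset. Luminance and gradient peaks mark bars and edges;
# the local-energy peak marks the point of phase alignment.
import numpy as np
from scipy.signal import hilbert

x = np.linspace(-np.pi, np.pi, 2049)
phase = np.deg2rad(45)           # common phase of all components
harmonics = np.arange(1, 32, 2)  # odd harmonics
lum = sum(np.cos(h * x + phase) / h for h in harmonics)

gradient = np.gradient(lum, x)
energy = lum ** 2 + np.imag(hilbert(lum)) ** 2  # local energy via Hilbert pair

for name, sig in [("luminance peak", lum), ("gradient peak", gradient),
                  ("energy peak", energy)]:
    print(f"{name:15s} at x = {x[np.argmax(sig)]:+.3f}")
# With a nonzero common phase, the luminance and gradient peaks shift away
# from x = 0, while the energy peak stays at x = 0 (local energy is
# invariant to a common phase), mirroring the psychophysical comparison.
```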
Abstract:
The development of new products in today's marketing environment is generally accepted as a requirement for the continual growth and prosperity of organisations. The literature is consequently rich with information on the development of various aspects of goods products. In the case of service industries, it can be argued that new service product development is of at least equal importance as it is to organisations that produce tangible goods products. Unlike the new goods product literature, the literature on service marketing practices, and in particular new service product development, is relatively sparse. The main purpose of this thesis is to examine a number of aspects of new service product development practice with respect to financial services and, specifically, credit card financial services. The empirical investigation utilises both a case study and a survey approach to examine aspects of new service product development industry practice relating specifically to gaps and deficiencies in the literature with respect to the financial service industry. The findings of the empirical work are subsequently examined in terms of how they provide guidance and support for a new normative model of new service product development. The study examines the UK credit card financial service product sector as an industry case study and perspective. The findings of the field work reveal that the new service product development process is still evolving, and that in the case of credit card financial services it can be seen as a well-structured and well-documented process. New product development can also be seen as an incremental, complex, interactive and continuous process which has been applied in a variety of ways. A number of inferences are subsequently presented.
Abstract:
This thesis is organised into three parts. In Part 1 relevant literature is reviewed and three critical components in the development of a cognitive approach to instruction are identified. These three components are considered to be the structure of the subject-matter, the learner's cognitive structures, and the learner's cognitive strategies, which act as control and transfer devices between the instructional materials and the learner's cognitive structures. Six experiments are described in Part 2, which is divided into two methodologically distinct units. The three experiments of Unit 1 examined how learning from materials constructed from concept name by concept attribute matrices is influenced by learner- or experimenter-controlled sequence and organisation. The results suggested that the relationships between input organisation, output organisation and recall are complex, and highlighted the importance of investigating organisational strategies at both acquisition and recall. The role of subjects' previously acquired knowledge and skills in relation to the instructional material was considered to be an important factor. The three experiments of Unit 2 utilised a "diagramming relationships" methodology which was devised as one means of investigating the processes by which new information is assimilated into an individual's cognitive structure. The methodology was found to be useful in identifying cognitive strategies related to successful task performance. The results suggested that errors could be minimised and comprehension improved on the diagramming relationships task by instructing subjects in ways which induced successful processing operations. Part 3 of this thesis highlights salient issues raised by the experimental work within the framework outlined in Part 1 and discusses potential implications for future theoretical developments and research.
Abstract:
This thesis first considers the calibration and signal processing requirements of a neuromagnetometer for the measurement of human visual function. Gradiometer calibration using straight wire grids is examined and optimal grid configurations determined, given realistic constructional tolerances. Simulations show that for a gradiometer balance of 1:10⁴ and a wire spacing error of 0.25mm, the achievable calibration accuracy is 0.3% for gain, 0.3mm for position and 0.6° for orientation. Practical results with a 19-channel 2nd-order gradiometer based system exceed this performance. The real-time application of adaptive reference noise cancellation filtering to running-average evoked response data is examined. In the steady state, the filter can be assumed to be driven by a non-stationary step input arising at epoch boundaries. Based on empirical measures of this driving step, an optimal progression for the filter time constant is proposed which improves upon fixed time constant filter performance. The incorporation of the time-derivatives of the reference channels was found to improve the performance of the adaptive filtering algorithm by 15-20% for unaveraged data, falling to 5% with averaging. The thesis concludes with a neuromagnetic investigation of evoked cortical responses to chromatic and luminance grating stimuli. The global magnetic field power of evoked responses to the onset of sinusoidal gratings was shown to have distinct chromatic and luminance sensitive components. Analysis of the results, using a single equivalent current dipole model, shows that these components arise from activity within two distinct cortical locations. Co-registration of the resulting current source localisations with MRI shows a chromatically responsive area lying along the midline within the calcarine fissure, possibly extending onto the lingual and cuneal gyri. It is postulated that this area is the human homologue of the primate cortical area V4.
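The reference noise cancellation described above is, in essence, an adaptive filter on the reference channels and their time-derivatives. The sketch below uses a plain LMS update on invented signals; the thesis's optimal time-constant progression for running averages is not reproduced.

```python
# A minimal sketch of reference-channel noise cancellation with an LMS
# adaptive filter, including a time-derivative of the reference as an
# extra input. All signals and the step size are invented for illustration.
import numpy as np

rng = np.random.default_rng(3)
n = 5000
t = np.arange(n)

signal = 0.5 * np.sin(2 * np.pi * t / 200)          # evoked-response proxy
noise_src = rng.normal(size=n).cumsum() / 30        # drifting environmental noise
reference = noise_src + 0.05 * rng.normal(size=n)   # reference channel
primary = signal + 0.8 * noise_src                  # measurement channel

# Filter inputs: the reference and its time-derivative.
X = np.column_stack([reference, np.gradient(reference)])
w = np.zeros(2)
mu = 0.01
cleaned = np.empty(n)
for i in range(n):
    est = X[i] @ w               # estimated noise contribution
    cleaned[i] = primary[i] - est
    w += mu * cleaned[i] * X[i]  # LMS weight update

print(f"residual noise power before: {np.var(primary - signal):.3f}")
print(f"residual noise power after:  {np.var(cleaned - signal):.3f}")
```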
Abstract:
Baths containing sulphuric acid as catalyst and others with selected secondary catalysts (methane sulphonic acid (MSA), SeO2, a KBrO3/KIO3 mixture, indium, uranium and the commercial high speed catalysts HEEF-25 and HEEF-405) were studied. The secondary catalysts influenced CCE, brightness and cracking. Chromium deposition mechanisms were studied in Part II using potentiostatic and potentiodynamic electroanalytical techniques under stationary and hydrodynamic conditions. Sulphuric acid as a primary catalyst and MSA, HEEF-25, HEEF-405 and sulphosalicylic acid as co-catalysts were explored for different rotation speeds and scan rates. The maximum current was resolved into diffusion-limited and kinetically limited components, and a contribution towards understanding the electrochemical mechanism is proposed. Reaction kinetics were further studied for H2SO4, MSA and methane disulphonic acid catalysed systems and their influence on reaction mechanisms elaborated. The charge transfer coefficient and electrochemical reaction orders for the first stage of the electrodeposition process were determined. A contribution was made towards understanding the influence of H2SO4 and MSA on the rate of hydrogen evolution. Anodic dissolution of chromium in the chromic acid solution was studied with a number of techniques. An electrochemical dissolution mechanism is proposed, based on the results of rotating gold ring-disc experiments and scanning electron microscopy. Finally, significant increases in chromium electrodeposition rates under non-stationary conditions (PRC mode) were studied and a deposition mechanism is elaborated based on experimental data and theoretical considerations.
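One standard way to resolve a rotating-electrode current into kinetic and diffusion-limited parts is a Koutecky-Levich analysis, fitting 1/i against ω^(-1/2). The sketch below does this on synthetic data; it is a generic illustration, not the thesis's actual decomposition for the chromium system.

```python
# A minimal sketch of a Koutecky-Levich decomposition:
# 1/i = 1/i_k + 1/(B*sqrt(omega)). Numbers are invented for illustration.
import numpy as np

omega = np.array([100.0, 400.0, 900.0, 1600.0])  # rotation rate, rad/s
i_k_true, B_true = 2.0, 0.05                     # kinetic current, Levich slope
i = 1.0 / (1.0 / i_k_true + 1.0 / (B_true * np.sqrt(omega)))  # synthetic data

# Linear fit of 1/i vs omega^(-1/2): intercept = 1/i_k, slope = 1/B.
slope, intercept = np.polyfit(omega ** -0.5, 1.0 / i, 1)
print(f"kinetic current i_k = {1.0 / intercept:.2f} (true {i_k_true})")
print(f"Levich coefficient B = {1.0 / slope:.3f} (true {B_true})")
```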
Abstract:
Most contemporary models of spatial vision include a cross-oriented route to suppression (masking from a broadly tuned inhibitory pool), which is most potent at low spatial and high temporal frequencies (T. S. Meese & D. J. Holmes, 2007). The influence of this pathway can elevate orientation-masking functions without exciting the target mechanism, and because early psychophysical estimates of filter bandwidth did not accommodate this, it is likely that they have been overestimated for this corner of stimulus space. Here we show that a transient 40% contrast mask causes substantial binocular threshold elevation for a transient vertical target, and this declines from a mask orientation of 0° to about 40° (indicating tuning), and then more gently to 90°, where it remains at a factor of ∼4. We also confirm that cross-orientation masking is diminished or abolished at high spatial frequencies and for sustained temporal modulation. We fitted a simple model of pedestal masking and cross-orientation suppression (XOS) to our data and those of G. C. Phillips and H. R. Wilson (1984) and found the dependency of orientation bandwidth on spatial frequency to be much less than previously supposed. An extension of our linear spatial pooling model of contrast gain control and dilution masking (T. S. Meese & R. J. Summers, 2007) is also shown to be consistent with our results using filter bandwidths of ±20°. Both models include tightly and broadly tuned components of divisive suppression. More generally, because XOS and/or dilution masking can affect the shape of orientation-masking curves, we caution that variations in bandwidth estimates might reflect variations in processes that have nothing to do with filter bandwidth.
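The models discussed above share a divisive gain control form in which a cross-oriented mask feeds the suppressive denominator without exciting the target mechanism. The sketch below implements that generic form with textbook-style exponents; the weights and exponents are illustrative, not the paper's fitted values.

```python
# A minimal sketch of divisive contrast gain control with cross-orientation
# suppression (XOS): the mask adds to the denominator only. Parameters are
# generic placeholders, not the paper's fitted values.
import numpy as np

def response(c_target, c_mask, w_xos=1.0, p=2.4, q=2.0, z=0.1):
    """Target mechanism response; the mask never drives the numerator."""
    excitation = c_target ** p
    suppression = z + c_target ** q + w_xos * c_mask ** q
    return excitation / suppression

def threshold(c_mask, criterion=0.01):
    """Smallest target contrast whose response change exceeds a criterion."""
    for c in np.linspace(1e-4, 1.0, 20000):
        if response(c, c_mask) - response(0.0, c_mask) > criterion:
            return c
    return np.nan

for c_mask in [0.0, 0.1, 0.4]:
    print(f"mask contrast {c_mask:.1f}: detection threshold ~ {threshold(c_mask):.4f}")
# Raising the cross-oriented mask contrast raises the denominator and so
# elevates threshold, without the mask ever exciting the target mechanism.
```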
Abstract:
Purpose – There appears to be an ever-insatiable demand from markets for organisations to improve their products and services. To meet this, there is a need for business process improvement (BPI) methodologies that are holistic, structured and procedural. Therefore, this paper describes research that has formed and tested a generic and practical methodology termed model-based and integrated process improvement (MIPI) to support the implementation of BPI, and validates its effectiveness in organisations. This methodology has been created as an aid for practitioners within organisations. Design/methodology/approach – The research objectives were achieved by reviewing and analysing current methodologies and selecting a few frameworks against key performance indicators; using a refined Delphi approach and semi-structured interviews with "experts" in the field; and adopting an intervention, case study and process research approach to evaluating the methodology. Findings – The BPI methodology was successfully formed and applied, by the researcher and directly by the companies involved, against the criteria of feasibility, usability and usefulness. Research limitations/implications – The paper has demonstrated new knowledge on how to systematically assess a BPI methodology in practice. Practical implications – The MIPI methodology offers the practitioner (experienced and novice) a set of step-by-step aids necessary to make informed, consistent and efficient changes to business processes. Originality/value – The novelty of this research work is the creation of a holistic, workbook-based methodology with relevant tools and techniques. It extends the capabilities of existing methodologies.
Abstract:
Refractive index and structural characteristics of optical polymers are strongly influenced by the thermal history of the material. Polymer optical fibres (POF) are drawn under tension, resulting in axial orientation of the polymer molecular chains due to their susceptibility to align in the fibre direction. This change in orientation from the drawing process results in residual strain in the fibre and also affects the transparency and birefringence of the material (1-3). PMMA POF can have a failure strain of over 100%, but the fibre has to be drawn under low tension to achieve this value. The drawing tension affects the degree of molecular alignment along the fibre axis, and thus the failure strain: the higher the tension, the lower the failure strain. However, the properties of fibre drawn under high tension can approach those of fibre drawn under low tension by means of an annealing process. Annealing can generally optimise the performance of POF while keeping most of its advantages intact. Annealing procedures can reduce index differences throughout the bulk and also reduce residual stress that may cause fracture or distortion. POF can be annealed at temperatures approaching the glass transition temperature (Tg) of the polymer to produce FBGs with a permanent blue Bragg wavelength shift at room temperature. At this elevated temperature, segmental motion in the structure results in a lower viscosity; the material softens and the molecular chains relax from the axial orientation, causing shrinking of the fibre. The large attenuation of typically 1dB/cm in the 1550nm spectral region of PMMA POF has limited FBG lengths to less than 10cm, and the more expensive fluorinated polymers with lower absorption have had no success as FBG waveguides. Bragg gratings have been inscribed into various POF in the 800nm spectral region using a 30mW continuous wave 325nm helium cadmium laser, with a much reduced attenuation coefficient of 10dB/m (5). Consistent fabrication of multiplexed FBGs in the 800nm spectral region in TOPAS and PMMA POF has led to the fabrication of multiplexed FBGs in the 700nm spectral region by a method of prolonged annealing. The Bragg wavelength shift of gratings fabricated in PMMA fibre at 833nm and 867nm was monitored whilst the POF was thermally annealed at 80°C. Permanent shifts exceeding 80nm into the 700nm spectral region were attained by both gratings on the fibre (see the sketch after the reference list below). The large permanent shift creates the possibility of multiplexed Bragg sensors operating over a broad range.
References:
1. Pellerin C, Prud'homme RE, Pézolet M. Effect of thermal history on the molecular orientation in polystyrene/poly(vinyl methyl ether) blends. Polymer. 2003;44(11):3291-7.
2. Dvoránek L, Machová L, Šorm M, Pelzbauer Z, Švantner J, Kubánek V. Effects of drawing conditions on the properties of optical fibers made from polystyrene and poly(methyl methacrylate). Die Angewandte Makromolekulare Chemie. 1990;174(1):25-39.
3. Dugas J, Pierrejean I, Farenc J, Peichot JP. Birefringence and internal stress in polystyrene optical fibers. Applied Optics. 1994;33(16):3545-8.
4. Jiang C, Kuzyk MG, Ding JL, Johns WE, Welker DJ. Fabrication and mechanical behavior of dye-doped polymer optical fiber. Journal of Applied Physics. 2002;92(1):4-12.
5. Johnson IP, Webb DJ, Kalli K, Yuan W, Stefani A, Nielsen K, et al., editors. Polymer PCF Bragg grating sensors based on poly(methyl methacrylate) and TOPAS cyclic olefin copolymer. 2011: SPIE.
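As a rough plausibility check on the reported shift of over 80nm, the sketch below applies the Bragg condition λ_B = 2·n_eff·Λ with an assumed annealing shrinkage and index change; both values are invented to illustrate the mechanism, not measured values from the thesis.

```python
# A minimal sketch of why annealing blue-shifts a POF grating: the Bragg
# condition lambda_B = 2 * n_eff * Lambda, with assumed (not measured)
# values for the shrinkage and index change produced by annealing at 80 C.
n_eff = 1.49             # approximate effective index of a PMMA core
grating_period = 280e-9  # metres, chosen so lambda_B sits near 833 nm

def bragg_wavelength(n, period):
    return 2.0 * n * period

lam0 = bragg_wavelength(n_eff, grating_period)
# Assumed annealing effects: the fibre shrinks along its axis and relaxed
# chain orientation slightly lowers the effective index.
lam1 = bragg_wavelength(n_eff - 0.002, grating_period * 0.905)
print(f"before annealing: {lam0 * 1e9:.0f} nm")
print(f"after annealing:  {lam1 * 1e9:.0f} nm (shift {(lam0 - lam1) * 1e9:.0f} nm)")
```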
Abstract:
Biomass-To-Liquid (BTL) is one of the most promising low carbon processes available to support the expanding transportation sector. This multi-step process produces hydrocarbon fuels from biomass: the so-called "second generation biofuels" that, unlike first generation biofuels, can make use of a wider range of biomass feedstock than just plant oils and sugar/starch components. A BTL process based on gasification has yet to be commercialised. This work focuses on the techno-economic feasibility of nine BTL plants. The scope was limited to hydrocarbon products as these can be readily incorporated and integrated into conventional markets and supply chains. The evaluated BTL systems were based on pressurised oxygen gasification of wood biomass or bio-oil and were characterised by different fuel synthesis processes, including Fischer-Tropsch synthesis, the Methanol to Gasoline (MTG) process and the Topsoe Integrated Gasoline (TIGAS) synthesis. This was the first time that these three fuel synthesis technologies were compared in a single, consistent evaluation. The selected process concepts were modelled using the process simulation software IPSEpro to determine mass balances, energy balances and product distributions. For each BTL concept, a cost model was developed in MS Excel to estimate capital, operating and production costs. An uncertainty analysis based on the Monte Carlo statistical method was also carried out to examine how uncertainty in the input parameters of the cost model could affect the output (i.e. production cost) of the model. This was the first time that an uncertainty analysis was included in a published techno-economic assessment study of BTL systems. It was found that bio-oil gasification cannot currently compete with solid biomass gasification due to the lower efficiencies and higher costs associated with the additional thermal conversion step of fast pyrolysis. Fischer-Tropsch synthesis was the most promising fuel synthesis technology for commercial production of liquid hydrocarbon fuels, since it achieved higher efficiencies and lower costs than TIGAS and MTG. None of the BTL systems were competitive with conventional fossil fuel plants. However, if the government tax take were reduced by approximately 33%, or a subsidy of £55/t dry biomass were available, transport biofuels could be competitive with conventional fuels. Large scale biofuel production may be possible in the long term through subsidies, fuel price rises and legislation.