Abstract:
Despite the difficulties surrounding the use of English in tertiary education in Turkey, we argue that it is necessary for those involved to study in the medium of English, and significant advances have been made on this front. These efforts have been for the most part language-oriented, but also include research into needs analysis and the pedagogy of team-teaching. Considering the current situation at this level of education, however, there still seems to be more to do, and the question is: what more can we do, what further contribution can we make, and how can we take this process further? The purpose of the study reported here is to respond to this last question. We test the proposition that it is possible to take this process further by investigating the efficient management of the transition from Turkish-medium to English-medium education at the tertiary level in Turkey. Going beyond what is achieved by the purely language-oriented EAP approach, and moving conceptually deeper than the team-teaching approach, the research undertaken for this study focuses on the idea of the discourse community that people want to belong to. It then pursues an adaptation of the essentially psycho-social approach of apprenticeship, as people become aspirants and apprentices to that discourse community. In this thesis, the researcher recognises that she cannot follow her ideas all the way through to full implementation in a fully-taught course; she is not in a position to change the education system. What she does here is introduce a concept and sample its effects, in terms of motivation, and thereby of integration and success, for individuals and groups of learners. Evaluation draws on both qualitative and quantitative data concerning mature members' perceptions of apprenticed neophytes functioning as members of the new community, the apprenticed neophytes' perceptions of their own membership and of the preparation process they undertook, and a comparison of these neophytes' performance with that of other neophytes in the community. The data obtained provide strong evidence for the potential usefulness of this apprenticeship model towards the declared purpose of improving the English-medium tertiary education of Turkish students in their chosen fields of study.
Abstract:
The main aim of this work was twofold: first, to investigate the effect of a highly reactive comonomer, divinylbenzene (DVB), on the extent of melt grafting of glycidyl methacrylate (GMA) on ethylene-propylene rubber (EPR) using 2,5-dimethyl-2,5-bis-(tert-butyl peroxy) hexane (Trigonox 101, T101) as a free radical initiator, and, second, to compare the results with conventional grafting of the same monomer on EPR. To achieve this, the effect of processing conditions and chemical composition, including the concentrations of peroxide, GMA and DVB, on the extent of grafting was investigated. The presence of the comonomer (DVB) in the grafting process resulted in a significant increase in the extent of grafting using only a small concentration of peroxide. It was also found that the extent of grafting increased drastically with increasing DVB concentration. Interestingly, in the comonomer system the extent of the undesired side reaction, namely the homopolymerisation of GMA (polyGMA), was shown to be reduced tremendously, and in most cases the level of polyGMA in the samples was immeasurable. Compared to conventional EPR-g-GMA(CONV) (in the absence of a comonomer), the presence of DVB in the grafting system was shown to result in more branching and crosslinking (evidenced by an increase in melt flow index (MFI) and torque values), an effect that grew with increasing DVB concentration. In contrast, the extent of grafting in the conventional system increased with increasing peroxide concentration, but the level of grafting was much lower than in the DVB case. Homopolymerisation of GMA and excessive crosslinking of EPR became dominant at high peroxide concentration, reflecting that the side reactions were favoured in the conventional grafting system. The second aim was to examine the effect of the in-situ functionalised EPR when used as a compatibiliser for binary blends. It was found that blending PET with functionalised EPR (ƒ-EPR) gave a significant improvement in terms of blend morphology as well as mechanical properties. The results showed clearly that blending PET with ƒ-EPR(DVB) (prepared with DVB) was much more effective than the corresponding PET/ƒ-EPR(CONV) (without DVB) blends, with ƒ-EPR(DVB) having an optimum grafting level of 2.1 wt% giving the most pronounced effect on morphology and mechanical properties. On the other hand, PET/ƒ-EPR(DVB) blends containing a high GMA/DVB ratio were found to be unfavourable, exhibiting lower tensile properties and poorer morphology. The presence of a high polyGMA concentration in ƒ-EPR(CONV) was found to have a damaging effect on blend morphology, resulting in reduced tensile properties (e.g. low elongation at break). However, the use of commercial terpolymers based on ethylene-methacrylate-glycidyl methacrylate (EM-GMA) or a copolymer of ethylene-glycidyl methacrylate (E-GMA) containing various GMA levels as compatibilisers in PET/EPR blends was found to be more efficient than the PET/EPR/ƒ-EPR blends, the former showing finer morphology and higher elongation at break. The high efficiency of the terpolymers or copolymers in compatibilising the PET/EPR blends is suggested to be due partly to their higher GMA content compared to that of ƒ-EPR, and partly to their low viscosity.
Abstract:
In order to generate sales promotion response predictions, marketing analysts estimate demand models using either disaggregated (consumer-level) or aggregated (store-level) scanner data. Comparison of predictions from these demand models is complicated by the fact that models may accommodate different forms of consumer heterogeneity depending on the level of data aggregation. This study shows via simulation that demand models with various heterogeneity specifications do not produce more accurate sales response predictions than a homogeneous demand model applied to store-level data, with one major exception: a random coefficients model designed to capture within-store heterogeneity using store-level data produced significantly more accurate sales response predictions (as well as better fit) compared to other model specifications. An empirical application to the paper towel product category adds additional insights. This article has supplementary material online.
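The abstract does not give the model specification; as a hedged illustration, a random coefficients demand model that captures within-store consumer heterogeneity using store-level data might take the form

\[
q_{st} = M_s \int \pi(x_{st}, \beta)\, dF(\beta) + \varepsilon_{st},
\qquad \beta \sim N(\bar{\beta}, \Sigma),
\]

where \(q_{st}\) is unit sales in store \(s\) in week \(t\), \(M_s\) is the store's customer base, \(x_{st}\) collects the promotion variables, \(\pi(\cdot)\) is an individual purchase probability, and \(F\) is the within-store distribution of consumer coefficients. The homogeneous benchmark collapses \(F\) to a point mass at \(\bar{\beta}\) (i.e. \(\Sigma = 0\)).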
Abstract:
This thesis contributes to the Change Data Capture (CDC) field by providing an empirical evaluation of the performance of CDC architectures in the context of real-time data warehousing. CDC is a mechanism for providing data warehouse architectures with fresh data from Online Transaction Processing (OLTP) databases. There are two types of CDC architecture: pull architectures and push architectures. Scant data exist on the performance of CDC architectures in a real-time environment, and such data are required to determine the real-time viability of the two architectures. We propose that push CDC architectures are optimal for real-time CDC. However, push CDC architectures are seldom implemented because they are highly intrusive towards existing systems and arduous to maintain. As part of our contribution, we pragmatically develop a service-based push CDC solution which addresses the issues of intrusiveness and maintainability. Our solution uses Data Access Services (DAS) to decouple CDC logic from the applications. A requirement for the DAS is to place minimal overhead on a transaction in an OLTP environment. We synthesize the DAS literature and pragmatically develop DAS that efficiently execute transactions in an OLTP environment. Essentially, we develop efficient RESTful DAS which expose Transactions As A Resource (TAAR). We evaluate the TAAR solution and three pull CDC mechanisms in a real-time environment, using the industry-recognised TPC-C benchmark. The optimal CDC mechanism in a real-time environment will capture change data with minimal latency and will have a negligible effect on the database's transactional throughput. Capture latency is the time it takes a CDC mechanism to capture a data change that has been applied to an OLTP database. A standard definition of capture latency, and of how to measure it, does not exist in the field; we create this definition and extend the TPC-C benchmark to make the capture latency measurement. The results of our evaluation show that pull CDC is capable of real-time CDC at low levels of user concurrency. However, as user concurrency scales upwards, pull CDC has a significant impact on the database's transaction rate, which affirms the theory that pull CDC architectures are not viable in a real-time architecture. TAAR CDC, on the other hand, is capable of real-time CDC and places minimal overhead on the transaction rate, although this performance comes at the expense of CPU resources.
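The thesis's actual DAS implementation is not reproduced here. The following is a minimal sketch, assuming SQLite as a stand-in OLTP store and an in-process queue as the change transport, of how a TAAR-style push DAS might emit each committed change together with the timestamp needed to measure capture latency (the `das_execute` and `capture_latency` names are hypothetical):

```python
import sqlite3
import time
import queue

change_queue = queue.Queue()  # stands in for the CDC transport to the warehouse side

def das_execute(conn, sql, params):
    """Hypothetical DAS entry point: executes the OLTP write and, as part
    of the same request, pushes the change event (push CDC). The change
    itself is handed over as a resource, in the spirit of TAAR."""
    cur = conn.execute(sql, params)
    conn.commit()
    commit_ts = time.monotonic()
    change_queue.put({"sql": sql, "params": params, "commit_ts": commit_ts})
    return cur.rowcount

def capture_latency(event):
    """Capture latency: time from commit until the CDC mechanism holds
    the change (the quantity the thesis defines and measures)."""
    return time.monotonic() - event["commit_ts"]

# Toy OLTP store; the table echoes TPC-C's district table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE district (d_id INTEGER PRIMARY KEY, d_ytd REAL)")
conn.execute("INSERT INTO district VALUES (1, 0.0)")
das_execute(conn, "UPDATE district SET d_ytd = d_ytd + ? WHERE d_id = ?", (42.0, 1))
print("capture latency (s):", capture_latency(change_queue.get()))
```

The point of the pattern is that capture happens inside the write path, so latency is bounded by the request itself rather than by a polling interval, mirroring the push-versus-pull trade-off the evaluation measures.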
Abstract:
Metallocene ethylene-1-octene copolymers having different densities and comonomer contents ranging from 11 to 36 wt% (m-LLDPE), and a Ziegler copolymer (z-LLDPE) containing the same level of short-chain branching (SCB) as one of the m-LLDPE polymers, were subjected to extrusion. The effects of temperature (210-285 °C) and multi-pass extrusion (up to five passes) on the rheological and structural characteristics of these polymers were investigated using melt index and capillary rheometry, along with spectroscopic characterisation of the evolution of various products by FTIR, ¹³C-NMR and colour measurements. The aim is to develop a better understanding of the effects of processing variables on the structure and thermal degradation of these polymers. Results from rheology show that both extrusion temperature and the amount of comonomer have a significant influence on the thermo-oxidative behaviour of the polymer melt. At low to intermediate processing temperatures, all m-LLDPE polymers exhibited similar behaviour, with crosslinking reactions dominating their thermal oxidation. By contrast, at higher processing temperatures the behaviour of the metallocene polymers changed depending on the comonomer content: higher SCB gave rise to predominantly chain scission reactions, whereas polymers with a lower level of SCB continued to be dominated by crosslinking. This temperature dependence was attributed to differences in the evolution of carbonyl and unsaturated compounds, including vinyl, vinylidene and trans-vinylene groups.
Abstract:
Fatigue crack propagation and threshold data for two Ni-base alloys, Astroloy and Nimonic 901, are reported. At room temperature, the effect of altering the load ratio (R-ratio) on fatigue behaviour is strongly dependent on grain size. In the coarse-grained microstructures, crack growth rates increase and threshold values decrease markedly as R rises from 0.1 to 0.8, whereas only small changes in behaviour occur in fine-grained material. In Astroloy, when strength level and γ grain size are kept constant, processing route and γ′ distribution have very little effect on room-temperature threshold and crack propagation results. The dominant microstructural effect on this type of fatigue behaviour is the matrix (γ) grain size itself.
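For reference, the standard definitions behind these quantities (not spelled out in the abstract): the load ratio compares minimum to maximum stress intensity over a cycle, and growth-rate data of this kind are conventionally described by a Paris-law regime above a threshold range \(\Delta K_{th}\):

\[
R = \frac{K_{\min}}{K_{\max}}, \qquad
\frac{da}{dN} = C\,(\Delta K)^m, \qquad
\Delta K = K_{\max} - K_{\min} \ge \Delta K_{th}.
\]

Raising R at fixed \(\Delta K\) raises the mean stress intensity of the cycle, which is consistent with the reported fall in threshold values as R increases.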
Abstract:
In the Institute of Cybernetics of National Academy of Sciences of Ukraine the smart portable fluorometer for express-diagnostics of photosynthesis was designed. The device allows easy to estimate the level of influence of natural environment and pollutions to alive plants. The device is based on real time processing of the curve of chlorophyll fluorescent induction. The principles of operation and results of experimental researches of device are described in the article.
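The abstract does not say which parameters the device derives from the induction curve; a minimal sketch, assuming the widely used Fv/Fm stress index computed from the curve's minimal (F0) and maximal (Fm) fluorescence levels:

```python
def fv_over_fm(induction_curve):
    """Hypothetical post-processing of a chlorophyll fluorescence
    induction curve: Fv/Fm = (Fm - F0) / Fm, a standard indicator
    of photosystem II efficiency."""
    f0 = min(induction_curve)  # minimal fluorescence (dark-adapted origin)
    fm = max(induction_curve)  # maximal fluorescence at saturation
    return (fm - f0) / fm

# Toy curve: fluorescence rises from F0 toward Fm during induction.
print(fv_over_fm([120, 300, 480, 560, 590, 600]))  # -> 0.8
```

Healthy dark-adapted leaves typically give Fv/Fm around 0.8; depressed values are a common indicator of environmental stress, which matches the device's stated use.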
Abstract:
Complex Event Processing (CEP) has emerged over the last ten years. CEP systems excel at processing large amounts of data and responding in a timely fashion. While CEP applications are fast-growing, performance management in this area has not gained much attention. Meeting the promised level of service is critical for both system designers and users. In this paper we present a benchmark for complex event processing systems, CEPBen. The CEPBen benchmark is designed to evaluate CEP functional behaviours, i.e. filtering, transformation and event pattern detection, and provides a novel methodology for evaluating the performance of CEP systems. A performance study running CEPBen on the Esper CEP engine is described and discussed. The results obtained from the performance tests demonstrate the influence of CEP functional behaviours on system performance.
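CEPBen's workloads are defined for Esper, whose EPL statements are not reproduced in the abstract; as a language-neutral illustration of the three functional behaviours the benchmark exercises, here is a toy sketch (all names hypothetical):

```python
from dataclasses import dataclass

@dataclass
class Event:
    kind: str
    value: float

def filtering(stream, kind):
    """Filtering: pass through only events of a given kind."""
    return (e for e in stream if e.kind == kind)

def transformation(stream, scale):
    """Transformation: derive a new event from each input event."""
    return (Event(e.kind, e.value * scale) for e in stream)

def detect_pattern(stream, first, then):
    """Event pattern detection: report each occurrence of `first`
    followed (not necessarily immediately) by `then`."""
    armed = False
    for e in stream:
        if e.kind == first:
            armed = True
        elif armed and e.kind == then:
            armed = False
            yield e

events = [Event("A", 1), Event("B", 2), Event("A", 3), Event("C", 4), Event("B", 5)]
print(list(detect_pattern(iter(events), "A", "B")))  # the two A->B matches
```

Filtering and transformation are stateless; pattern detection must keep state across events, which is typically why it weighs differently on the performance profile a benchmark like this measures.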
Abstract:
We explored the role of modularity as a means to improve evolvability in populations of adaptive agents, performing two sets of artificial life experiments. In the first, the adaptive agents were neural networks controlling the behavior of simulated garbage-collecting robots, where modularity referred to the networks' architectural organization and evolvability to the capacity of the population to adapt to environmental changes, measured by the agents' performance. In the second, the agents were programs that control the changes in a network's synaptic weights (learning algorithms), the modules were emergent clusters of symbols with a well-defined function, and evolvability was measured through the level of symbol diversity across programs. We found that the presence of modularity (either imposed by construction or as an emergent property in a favorable environment) is strongly correlated with the presence of very fit agents adapting effectively to environmental changes. In the case of the learning algorithms, we also observed that symbol diversity and modularity are strongly correlated quantities.
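The abstract does not give the diversity metric; a minimal sketch, assuming Shannon entropy of the pooled symbol distribution as the measure of symbol diversity across programs (`symbol_diversity` is a hypothetical name):

```python
from collections import Counter
from math import log2

def symbol_diversity(programs):
    """One plausible diversity measure: Shannon entropy of the
    distribution of symbols pooled across all programs in the
    population (higher entropy = more diverse symbol usage)."""
    counts = Counter(sym for prog in programs for sym in prog)
    total = sum(counts.values())
    return -sum((n / total) * log2(n / total) for n in counts.values())

population = ["AABBA", "ABCAB", "CCBAA"]  # programs as symbol strings
print(symbol_diversity(population))
```

Under this measure, a population whose programs reuse the same few symbols scores low, while one whose programs draw on many distinct symbols scores high, matching the intended reading of diversity as an evolvability proxy.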
Abstract:
The activities of the Institute of Information Technologies in the area of automatic text processing are outlined. Major problems related to different steps of processing are pointed out together with the shortcomings of the existing solutions.
Abstract:
Fluorescence spectroscopy has recently become more common in clinical medicine. However, there are still many unresolved issues related to the methodology and to the implementation of instruments using this technology. In this study we aimed to assess the individual variability of fluorescence parameters of endogenous markers (NADH, FAD, etc.) measured by fluorescence spectroscopy (FS) in situ, and to analyse the factors that lead to a significant scatter of results. Most of the studied fluorophores show a scatter of values (mostly up to 30%) that is acceptable for diagnostic purposes. We provide evidence that the blood volume in tissue affects FS data, with a significant inverse correlation. For most of the studied fluorophores, and for the redox ratio, the fluorescence intensity and fluorescent contrast coefficient values follow a normal distribution. The effects of various physiological factors (differing skin melanin content) and technical factors (characteristics of the optical filters) on the measurement results were additionally studied. The variability of FS measurement results should be considered when interpreting diagnostic parameters, as well as when developing new data processing algorithms and FS devices.
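Definitions of the redox ratio vary across the FS literature; one common optical form, given here only as an assumed illustration, combines the NADH and FAD fluorescence intensities as

\[
RR = \frac{I_{\mathrm{FAD}}}{I_{\mathrm{FAD}} + I_{\mathrm{NADH}}}
\]

(some groups instead report \(I_{\mathrm{NADH}}/I_{\mathrm{FAD}}\) or its inverse), which is why the ratio can be analysed for normality alongside the individual intensities.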
Abstract:
We introduce a novel algorithm for medial surface extraction based on density-corrected Hamiltonian analysis. The approach extracts the skeleton directly from a triangulated mesh and adopts an adaptive octree-based scheme in which only skeletal voxels are refined to a lower level of the hierarchy, resulting in robust and accurate skeletons at extremely high resolution. The quality of the extracted medial surfaces is confirmed by an extensive set of experiments.
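The paper's medialness test is the density-corrected Hamiltonian flux, which the abstract does not detail; the sketch below stands it in with an `is_skeletal` predicate and shows only the adaptive-refinement skeleton of the idea (all names hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class Octant:
    center: tuple   # (x, y, z) cell centre
    size: float     # edge length
    depth: int = 0
    children: list = field(default_factory=list)

    def split(self):
        """Create the eight child octants, each half the edge length."""
        h = self.size / 4  # offset from this centre to a child centre
        for dx in (-h, h):
            for dy in (-h, h):
                for dz in (-h, h):
                    c = (self.center[0] + dx, self.center[1] + dy, self.center[2] + dz)
                    self.children.append(Octant(c, self.size / 2, self.depth + 1))

def refine(node, is_skeletal, max_depth):
    """Adaptive refinement: only cells flagged as skeletal are subdivided,
    so resolution is spent on the medial surface, not the whole volume."""
    if node.depth >= max_depth or not is_skeletal(node):
        return
    node.split()
    for child in node.children:
        refine(child, is_skeletal, max_depth)

# Toy medialness test: cells straddling the z = 0 plane count as skeletal.
root = Octant(center=(0.0, 0.0, 0.0), size=1.0)
refine(root, lambda n: abs(n.center[2]) < n.size, max_depth=4)
```

Because non-skeletal octants are never split, the cost grows with the area of the medial surface rather than the volume of the grid, which is what makes extreme resolutions practical.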
Abstract:
This article examines the total logistics costs, and their components, of Hungarian small- and medium-sized enterprises in the light of international research, based on the 2009 "Survey of the position of enterprises", which covered nearly 2,000 Hungarian SMEs. Company size and sector are the most important drivers of logistics costs. The transport and warehousing activities that Hungarian SMEs have not outsourced would represent a significant market expansion for national carriers and warehouse providers. This is only a latent expansion, however, because Hungarian SMEs do not plan to outsource their transport and warehousing activities to any greater extent. A partial explanation is that a relatively high proportion of processing-industry, agricultural and trading SMEs regard logistics as a core competence. In international comparison, Hungarian SMEs face high logistics costs, and reducing them could bring savings similar to those from reducing administrative burdens.
Abstract:
The purpose of this dissertation was to examine the form of the consumer satisfaction/dissatisfaction (CS/D) response to disconfirmation, and to explore the cognitive and affective processes underlying the response. Respondents were provided with information from a prior market research study about a new brand of printer that was being tested. This market research information helped set prior expectations regarding print quality. Subjects were randomly assigned to an experimental condition that manipulated prior expectations to be either positive or negative. Respondents were then provided with printouts whose quality was either worse (negative disconfirmation) or better (positive disconfirmation) than the prior expectations. In other words, for each level of expectation, respondents were assigned to either a positive or a negative disconfirmation condition. Subjects were also randomly assigned to either a high or a low level of outcome involvement. Analyses of variance indicated that positive disconfirmation led to a more intense CS/D response than negative disconfirmation, even though the intensity of the disconfirmation itself did not differ between the positive and negative conditions. Intensity of CS/D was measured by the distance of the CS/D rating from the midpoint of the scale. The study also found that although outcome involvement did not influence the polarity of the CS/D response, more direct measures of processing involvement, such as the subjects' concentration, attention and care in evaluating the printout, did have a significant positive effect on CS/D intensity. Analyses of covariance further indicated that the relationship between the intensity of the CS/D response and the intensity of the disconfirmation was mediated by the intensity of affective responses: positive disconfirmation led to more intense affective responses than negative disconfirmation.
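In the notation implied by the abstract's operationalisation, the intensity measure is simply the distance of the rating from the scale midpoint:

\[
I_{CS/D} = \lvert r - m \rvert,
\]

where \(r\) is the respondent's CS/D rating and \(m\) the scale midpoint; polarity is carried by the sign of \(r - m\).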