859 results for Work Performance
High-Performance-Tensile-Strength Alpha-Grass Reinforced Starch-Based Fully Biodegradable Composites
Abstract:
Although a great deal of work has been devoted to natural fibers as reinforcement in starch-based composites, there is still more to be done. In general, cellulose fibers have lower strength than glass fibers; however, their specific strength is not far from that of fiberglass. In this work, alpha-fibers were obtained from alpha-grass through a mild cooking process. The fibers were used to reinforce a starch-based biopolymer. Composites including 5 to 35% (w/w) alpha-grass fibers in their formulation were prepared and tested, and their properties were compared with those of wood- and fiberglass-reinforced polypropylene (PP) composites. The term “high-performance” refers to the tensile strength of the studied composites and is mainly due to a good interphase, a good dispersion of the fibers within the matrix, and a good fiber aspect ratio. The tensile strength of the composites increased linearly with fiber content up to 35% (w/w). The strain at break decreased with increasing fiber content, reflecting the stiffening effect of the reinforcement. The prepared composites showed high mechanical properties, even approaching those of glass-fiber-reinforced composites.
Abstract:
Centrifugal compressors are widely used, for example, in refrigeration processes, the oil and gas industry, superchargers, and wastewater treatment. In this work, five different vaneless diffusers and six different vaned diffusers are investigated numerically. The vaneless diffusers vary only in their width, so that four of the geometries have a pinch implemented in them; pinch means a decrease in the diffuser width. Four of the vaned diffusers have the same vane turning angle and a different number of vanes, and two have different vane turning angles. The flow solver used to solve the flow fields is Finflo, a Navier-Stokes solver. All the cases are modeled with Chien's k–ε turbulence model, and selected cases are also modeled with the k–ω SST turbulence model. All five vaneless diffusers and three of the vaned diffusers are also investigated experimentally. For each configuration, the compressor operating map is measured according to relevant standards. In addition, the flow fields before and after the diffuser are measured with static pressure, total pressure, flow angle and total temperature measurements. When the computational results are compared with the measured results, it is evident that the k–ω SST turbulence model predicts the flow fields better. The simulation results indicate that it is possible to improve the efficiency with the pinch; according to the numerical results, the two best geometries are the ones with the most pinch at the shroud. These geometries have approximately 4 percentage points higher efficiency than the unpinched vaneless diffusers. The hub pinch does not seem to have any major benefits. In general, the pinches make the flow fields before and after the diffuser more uniform. The pinch also seems to improve the impeller efficiency, for two reasons. The major reason is that the pinch decreases the size of the slow-flow and possible backflow region located near the shroud after the impeller.
Secondly, the pinches decrease the flow velocity in the tip clearance, leading to a smaller tip leakage flow and therefore slightly better impeller efficiency. Some of the vaned diffusers also improve the efficiency, the increment being 1–3 percentage points compared with the vaneless unpinched geometry. The measurement results confirm that the pinch is beneficial to the performance of the compressor. The flow fields are more uniform with the pinched cases, and the slow-flow regions are smaller. The peak efficiency is approximately 2 percentage points and the design-point efficiency approximately 4 percentage points higher with the pinched geometries than with the unpinched geometry. According to the measurements, the two best geometries are the ones with the most pinch at the shroud, the case with the pinch only at the shroud being slightly the better of the two. The vaned diffusers also have better efficiency than the vaneless unpinched geometries; however, the pinched cases have even better efficiencies. The vaned diffusers narrow the operating range considerably, whilst the pinch has no significant effect on the operating range.
Abstract:
Today a typical embedded system (e.g. a mobile phone) requires high performance to carry out tasks such as real-time encoding/decoding; it must consume little power in order to run for hours or days on lightweight batteries; it must be flexible enough to integrate multiple applications and standards in a single device; and it must be designed and verified in a short time despite increasing complexity. Designers struggle against these adversities, which call for new innovations in architectures and design methodologies. Coarse-grained reconfigurable architectures (CGRAs) are emerging as potential candidates to overcome all these difficulties, and different types of such architectures have been presented in recent years. Their coarse granularity greatly reduces delay, area, power consumption and configuration time compared with FPGAs. On the other hand, compared with traditional coarse-grained programmable processors, their abundant computational resources allow them to achieve a high level of parallelism and efficiency. Nevertheless, existing CGRAs have not seen wide application, mainly because of the great difficulty of programming such complex architectures. ADRES is a new CGRA designed by the Interuniversity Micro-Electronics Center (IMEC). It combines a very long instruction word (VLIW) processor and a coarse-grained array, providing two different options in the same physical device. Among its advantages are high performance, little communication overhead and ease of programming. Finally, ADRES is a template rather than a concrete architecture: with the help of the DRESC (Dynamically Reconfigurable Embedded System Compiler) compiler, it is possible to derive better or application-specific architecture instances. This work presents the implementation of an MPEG-4 encoder for ADRES. It shows the evolution of the code towards a good implementation for a given architecture.
The main features of ADRES and its compiler (DRESC) are also presented. The objectives are to minimise the number of cycles (i.e. the time) needed to run the MPEG-4 encoder and to examine the various difficulties of working in the ADRES environment. The results show that the cycle count is reduced by 67% between the initial and final code in VLIW mode, and by 84% between the initial code in VLIW mode and the final code in CGA mode.
Abstract:
This is a study of team social networks, their antecedents and outcomes. In focusing attention on the structural configuration of the team, this research contributes to a new wave of thinking concerning group social capital. The research was based on a random sample of Finnish work organisations: the data consisted of 499 employees in 76 teams representing 48 different organisations. A systematic literature review and quantitative methods were used in conducting the research, the former primarily to establish the current theoretical position on the relationships among the variables and the latter to test these relationships. Social network analysis was the primary method used in identifying the social-network relations among the work-team members. The first and key contribution of this study is that it relates the structural network properties of work teams to behavioural outcomes, attitudinal outcomes and, ultimately, team performance. Moreover, it shows that addressing attitudinal outcomes is also important in terms of team performance; attitudinal outcomes (team identity) mediated the relationship between the team’s social network and its performance. The second contribution is that it examines the possible antecedents of the social structure. It is thus one response to Salancik’s (1995) call for a network theory in that it explains why certain network characteristics exist. It demonstrates that, irrespective of whether or not a team is heterogeneous in terms of age or gender, educational diversity may protect it from centralisation. However, heterogeneity in terms of gender turned out to have a negative impact on density. Thirdly, given the observation that the benefits of (team) networks are typically theorised and modelled without reference to the nature of the relationships comprising the structure, the study directly tested whether team knowledge mediated the effects of instrumental and expressive network relationships on team performance.
Furthermore, with its focus on expressive networks that link the workplace to a more informal world, and which have been rather neglected in previous research, it enhances knowledge of teams and networks. The results indicate that knowledge sharing fully mediates the influence of complementarities between dense and fragmented instrumental network relationships, thus providing empirical validation of the implicit understanding that networks transfer knowledge. Fourthly, the study findings suggest that an optimal configuration of the work-team social-network structure combines both bridging and bonding social relationships.
Abstract:
This work focused on the development and validation of an RP-HPLC-UV method for the quantification of beta-lactam antibiotics in three pharmaceutical samples. The active principles analyzed were amoxicillin and ampicillin, in three veterinary drugs. The mobile phase comprised a 5 mmol L⁻¹ phosphoric acid solution at pH 2.00 and acetonitrile, in gradient elution mode, with detection at 220 nm. The method was validated according to the Brazilian National Health Surveillance regulation, with linear range and linearity, selectivity, precision, accuracy and ruggedness being evaluated. Inter-day precision and accuracy for pharmaceutical samples 1, 2 and 3 were 1.43 and 1.43%, 4.71 and 3.74%, and 2.72 and 1.72%, respectively, while regression coefficients for the analytical curves exceeded 0.99. The method had acceptable figures of merit, indicating reliable quantification. The analyzed samples had active-principle concentrations varying from −12 to +21% of the manufacturers’ label claims, rendering these medicines unsafe for administration to animals.
Abstract:
The @450 wireless broadband service is Digita’s mobile wireless broadband network service. In the @450 network, Digita acts as the network operator, offering network capacity to service operators. For Digita it is important to know what kinds of services its network is capable of supporting and what its service parameters are. Knowledge of the network’s parameters and behaviour can be used in advance in the development of new service products. Before a new service product can be offered to service operators, a lot of work has to be done: basic testing is necessary to gain an understanding of the basic functionality, a requirement specification has to be written, and a new product has to be created and tested. The test results have to be analysed in order to find out whether the new product is suitable for real use and with which limitations. This thesis covers the development of wireless technologies, the @450 service and network, FLASH-OFDM technology, FLASH-OFDM performance testing and the development of a new service product.
Abstract:
The topic of this thesis is the simulation of a combination of several control and data assimilation methods meant to be used for controlling the quality of paper in a paper machine. Paper making is a very complex process and the information obtained from the web is sparse: a paper web scanner can only measure a zigzag path on the web. An assimilation method is needed to produce estimates of the Machine Direction (MD) and Cross Direction (CD) profiles of the web, and quality control is based on these estimates. There is an increasing need for intelligent methods to assist in data assimilation, and the target of this thesis is to study how such intelligent assimilation methods affect paper web quality. This work is based on a paper web simulator developed in the TEKES-funded MASI NoTes project. The simulator is a valuable tool for comparing different assimilation methods. The thesis compares four data assimilation methods: a first-order Bayesian model estimator, an ARMA model based on a higher-order Bayesian estimator, a Fourier-transform-based Kalman filter estimator and a simple block estimator. The last of these can be considered close to current operational methods. Of these methods, the Bayesian, ARMA and Kalman estimators all seem to have advantages over the commercial one, and the Kalman and ARMA estimators seem to be the best in overall performance.
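Of the four estimators compared, the Kalman filter is perhaps the easiest to illustrate in isolation. Below is a minimal sketch of a scalar Kalman-filter update of the kind such an estimator builds on, assimilating noisy scanner readings into a profile estimate. All names, noise values and measurements are illustrative assumptions, not code or data from the thesis or the MASI NoTes simulator.

```python
def kalman_update(x, P, z, R, Q=1e-4):
    """One predict/update step for a random-walk state model.

    x, P : prior state estimate and its variance
    z, R : new (noisy) scanner measurement and its variance
    Q    : assumed process noise of the random walk
    """
    P = P + Q                  # predict: the state may have drifted
    K = P / (P + R)            # Kalman gain: how much to trust the new data
    x = x + K * (z - x)        # pull the estimate toward the measurement
    P = (1.0 - K) * P          # uncertainty shrinks after the update
    return x, P

# Usage: assimilate a short run of noisy readings of a quantity near 1.0.
x, P = 0.0, 1.0
for z in [0.9, 1.1, 1.0, 0.95, 1.05]:
    x, P = kalman_update(x, P, z, R=0.04)
```

After a few measurements the estimate converges toward the true level while its variance drops; a full CD/CD-profile estimator applies the same predict/update logic per profile position.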
Abstract:
In this study the recently developed concept of strategic entrepreneurship was addressed, with the aim of investigating the underlying factors and components constituting the concept and their influence on firm performance. As a result of an analysis of the existing literature and empirical studies, a model of strategic entrepreneurship is developed for the current study, with emphasis on the exploration and exploitation parts of the concept. The research model is tested on data collected in 2007 in the project “Factors of growth and success of entrepreneurial firms in Russia” by the Center for Entrepreneurship of GSOM, containing the answers of owners and managers of 500 firms operating in St. Petersburg and Moscow. Multiple regression analysis showed that exploration and exploitation, represented by entrepreneurial values, investments in internal resources, knowledge management and developmental changes, are significant factors constituting strategic entrepreneurship and are positively related to firm performance. The theoretical contribution of the work lies in the development and testing of the model of strategic entrepreneurship. The results can be implemented in the management practices of companies wishing to engage in strategic entrepreneurship and increase their firm performance.
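The kind of multiple regression analysis described above can be sketched as fitting firm performance on exploration and exploitation factors by ordinary least squares. The data, coefficients and variable names below are made-up illustrations (only the sample size of 500 echoes the study), not the study’s actual data or results.

```python
# Illustrative multiple regression: performance ~ exploration + exploitation.
import numpy as np

rng = np.random.default_rng(0)
n = 500                                   # firms, mirroring the study's sample size
exploration = rng.normal(size=n)          # hypothetical exploration factor score
exploitation = rng.normal(size=n)         # hypothetical exploitation factor score
noise = rng.normal(scale=0.5, size=n)
performance = 1.0 + 0.4 * exploration + 0.3 * exploitation + noise

# Ordinary least squares: solve X b = y for intercept and slopes.
X = np.column_stack([np.ones(n), exploration, exploitation])
b, *_ = np.linalg.lstsq(X, performance, rcond=None)
print(b)  # estimates close to the assumed [1.0, 0.4, 0.3]
```

Positive estimated slopes on both factors correspond to the study’s finding of a positive relation between the exploration/exploitation components and firm performance.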
Abstract:
Machine learning provides tools for the automated construction of predictive models in data-intensive areas of engineering and science. The family of regularized kernel methods has in recent years become one of the mainstream approaches to machine learning, due to a number of advantages these methods share. The approach provides theoretically well-founded solutions to the problems of under- and overfitting, allows learning from structured data, and has been empirically demonstrated to yield high predictive performance on a wide range of application domains. Historically, the problems of classification and regression have gained the majority of attention in the field. In this thesis we focus on another type of learning problem: learning to rank. In learning to rank, the aim is to learn, from a set of past observations, a ranking function that can order new objects according to how well they match some underlying criterion of goodness. As an important special case of this setting we recover the bipartite ranking problem, which corresponds to maximizing the area under the ROC curve (AUC) in binary classification. Ranking applications appear in a large variety of settings; examples encountered in this thesis include document retrieval in web search, recommender systems, information extraction and automated parsing of natural language. We consider the pairwise approach to learning to rank, where ranking models are learned by minimizing the expected probability of ranking any two randomly drawn test examples incorrectly. The development of computationally efficient kernel methods based on this approach has in the past proven to be challenging. Moreover, it is not clear which techniques for estimating the predictive performance of learned models are the most reliable in the ranking setting, and how these techniques can be implemented efficiently. The contributions of this thesis are as follows.
First, we develop RankRLS, a computationally efficient kernel method for learning to rank that is based on minimizing a regularized pairwise least-squares loss. In addition to training methods, we introduce a variety of algorithms for tasks such as model selection, multi-output learning, and cross-validation, based on computational shortcuts from matrix algebra. Second, we improve the fastest known training method for the linear version of the RankSVM algorithm, which is one of the most well-established methods for learning to rank. Third, we study the combination of the empirical kernel map and reduced set approximation, which allows the large-scale training of kernel machines using linear solvers, and propose computationally efficient solutions for cross-validation when using this approach. Next, we explore the problem of reliable cross-validation when using AUC as a performance criterion, through an extensive simulation study. We demonstrate that the proposed leave-pair-out cross-validation approach leads to more reliable performance estimation than commonly used alternative approaches. Finally, we present a case study on applying machine learning to information extraction from biomedical literature, which combines several of the approaches considered in the thesis. The thesis is divided into two parts: Part I provides the background for the research work and summarizes the most central results, while Part II consists of the five original research articles that are the main contribution of this thesis.
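The pairwise view of bipartite ranking underlying the pairwise losses and AUC-oriented cross-validation above can be made concrete: AUC equals the fraction of (positive, negative) example pairs that a scoring function orders correctly, with ties counted as half. The function and data below are an illustrative sketch of that equivalence, not code from the thesis.

```python
def pairwise_auc(scores, labels):
    """AUC as the proportion of correctly ordered positive/negative pairs
    (ties count as half a correct ordering)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    correct = sum(1.0 if p > n else 0.5 if p == n else 0.0
                  for p in pos for n in neg)
    return correct / (len(pos) * len(neg))

# Usage: a scorer that ranks every positive above every negative.
scores = [0.9, 0.8, 0.3, 0.6, 0.2]
labels = [1,   1,   0,   1,   0]
print(pairwise_auc(scores, labels))  # 1.0: all 6 pos/neg pairs correctly ordered
```

Minimizing a pairwise loss pushes this quantity toward one pair at a time, which is why leave-pair-out cross-validation, holding out one positive–negative pair per round, matches the structure of the estimand.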
Abstract:
This work aimed to evaluate the consequences of the monthly extraction of immature leaves on the survival, leaf production and reproductive performance of the Copernicia prunifera H. E. Moore palm, popularly known as carnaúba. One hundred and sixty reproductive adult palms were monitored for 17 months in four extractive communities located on the coast of the state of Piauí. It was observed that leaf, flower, fruit and seed production was reduced in the palms submitted to 50% or 75% monthly extraction, and higher levels of extraction were followed by lower seed germination. No deaths were observed, even in the group subjected to 75% monthly leaf extraction. In order not to damage the development of the palm trees, it is suggested that the leaf extraction rate should not exceed 25% per month and that extractive activity be paused, preferentially during fruit maturation.
Abstract:
Potato is an important crop throughout the world. Harvesting is a fundamental step in its production system, and is perhaps its most complex and expensive operation. Thus, the objective of this work was to compare the cost, the operational capacity and the production losses of mechanized and semi-mechanized potato harvesting. The work was carried out on a commercial farm cultivated under a pivot irrigation system in the municipal district of Perdizes, MG, Brazil. A completely randomized design with two treatments was used: mechanized and semi-mechanized harvest. The mechanized harvest used a self-propelled harvester; in the semi-mechanized harvest, a tractor-mounted digger was used and the potatoes were harvested manually. It was concluded that the cost of the mechanized harvest was 49.03% lower than that of the semi-mechanized harvest. On average, the harvester did the work of 23 manual-harvest workers. The mechanized harvest showed losses of 2.35% of potato yield, while the semi-mechanized harvest showed losses of 6.32%.
Abstract:
The present work aimed to evaluate the tractive performance of four agricultural tractors with auxiliary front-wheel drive as a function of six lateral inclinations, on a lateral test track belonging to the Agronomic Sciences Faculty of São Paulo State University, Campus of Botucatu. The lateral inclinations were 0, 5, 10, 15, 20 and 25 degrees. In all situations the tractors operated with a predetermined load, a traction of 40 kN imposed on the pulled tractor at 0 degrees of inclination. A randomized block design was used, considering six inclinations and four tractors, with three repetitions for each treatment. The analyzed variables were slippage, tractive force, hourly fuel consumption and speed. It was concluded that the pneumatic tire configurations influenced the tractive performance of the tractors as the lateral inclination of the terrain increased.
Abstract:
The use of renewable fuels such as biodiesel can ease the demand for fossil fuels for power generation and transportation in rural areas. In this work, the performance impact of castor oil biodiesel is evaluated in an automotive and a stationary diesel engine. The application of B10 and B20 biodiesel blends and of pre-heated neat biodiesel is considered. The viability of employing B10 and B20 blends for mobility and power generation was observed in dynamometric bench tests, where these blends performed similarly to fossil diesel. With pre-heated neat biodiesel, however, a brake torque loss and an increase in specific consumption were observed relative to diesel fuel.
Abstract:
The study touches upon cooperation between marketing and sales departments and investigates the marketing-sales cooperative model within the case company. The research increases understanding of the linkages between marketing and sales departments through the illustrative example of a Russian medium-sized oil company (LLC Neste St. Petersburg), the subsidiary of Finnish-based Neste Oil. The empirical study is done from both the marketing and the sales perspective; for sales, the main attention was given to direct sales, both B2B and B2C. The research considers all five domains of cooperation and, among other things, reveals the attitude towards external (market) and internal (product) knowledge and its mutual use by marketing and sales managers. A qualitative research method, participant observation, and in-depth interviews with upper management made it possible to explore all facets of the joint work. Moreover, the research addresses the changes in the model of cooperation between marketing and sales when moving from a medium-sized to a large company.
Abstract:
A fast-changing environment puts pressure on firms to share large amounts of information with their customers and suppliers. The terms information integration and information sharing are essential for facilitating a smooth flow of information throughout the supply chain, and they are used interchangeably in the research literature. By integrating and sharing information, firms want to improve their logistics performance. Firms share information with their suppliers and customers by using traditional communication methods (telephone, fax, e-mail, written and face-to-face contacts) and by using advanced or modern communication methods such as electronic data interchange (EDI), enterprise resource planning (ERP), web-based procurement systems, electronic trading systems and web portals. Adopting new ways of using IT is one important resource for staying competitive in a rapidly changing market (Saeed et al. 2005, 387), and an information system that provides people with the information they need to perform their work supports company performance (Boddy et al. 2005, 26). The purpose of this research has been to test and understand the relationship between information integration with key suppliers and/or customers and a firm’s logistics performance, especially when information technology (IT) and information systems (IS) are used for integrating information. Quantitative and qualitative research methods have been used to perform the research. Special attention has been paid to the scope, level and direction of information integration (Van Donk & van der Vaart 2005a). In addition, the four elements of integration (Jahre & Fabbe-Costes 2008) are closely tied to the frame of reference: integration of flows, integration of processes and activities, integration of information technologies and systems, and integration of actors.
The study found that information integration has a weak positive relationship to operational performance and a moderate positive relationship to strategic performance. The potential performance improvements found in this study range from efficiency, delivery and quality improvements (operational) to profit, profitability and customer-satisfaction improvements (strategic). The results indicate that although information integration has an impact on a firm’s logistics performance, not all performance improvements have been achieved. This study also found that the use of IT and IS has a moderate positive relationship to information integration. Almost all the case companies agreed that the use of IT and IS could facilitate information integration and improve their logistics performance. The case companies felt that implementing a web portal or a data bank would benefit them by enhancing their performance and increasing information integration.