969 results for Processing technique of resin transfer molding (RTM)


Relevance: 100.00%

Abstract:

We have used neutron reflectometry to characterize the swelling behaviour of brushes of poly[2-(diethyl amino)ethyl methacrylate], a polybase, as a function of pH. The brushes, synthesized by the "grafting from" method of atom transfer radical polymerization, were observed to approximately double their thickness in low pH solutions, although the pK is shifted to a lower pH than in dilute solution. The composition-depth profile obtained from the reflectometry experiments for the swollen brushes reveals a region depleted in polymer between the substrate and the extended part of the brush.

Relevance: 100.00%

Abstract:

This thesis explores the interaction between Micros (<10 employees) from non-creative sectors and website designers ("Creatives") when creating a website of a higher order than a basic template site. The research used the Straussian Grounded Theory Method with a longitudinal design to identify what knowledge transferred to the Micros during the collaboration, how it transferred, what factors affected the transfer, and the outcomes of the transfer, including behavioural additionality. To identify whether the research could be extended beyond this setting, five other design areas were also examined, as well as five Small to Medium Enterprises (SMEs) engaged in website and branding projects.

The findings were that, at the start of the design process, many Micros could not articulate their customer knowledge and had poor marketing and visual language skills; such knowledge is core to web design because it enables targeted communication with customers through images. Despite these gaps, most Micros still tried to lead the process. To overcome this disjoint, the majority of the designers used a knowledge transfer strategy termed in this thesis 'Bi-Modal Knowledge Transfer', in which the Creative was aware of the transfer but the Micro was unaware, both for drawing out customer knowledge from the Micro and for transferring visual language skills to the Micro. Two models were developed to represent this process.

Two further models were created to map changes in the knowledge landscapes of customer knowledge and visual language: the Knowledge Placement Model and the Visual Language Scale. The Knowledge Placement Model maps the placement of customer knowledge within consciousness, extending the known Automatic-Unconscious-Conscious model with two further locations, Peripheral Consciousness and Occasional Consciousness. Peripheral Consciousness is where potential knowledge is held but not used; Occasional Consciousness is where potential knowledge is held but used only for specific tasks. The Visual Language Scale measures visual language ability from visually responsive, where the participant only responds personally to visual symbols, to visually multi-lingual, where the participant can use visual symbols to communicate with multiple thought-worlds.

With successful Bi-Modal Knowledge Transfer, the outcome included not only an effective website but also changes in the knowledge landscape of the Micros and ongoing behavioural changes, especially in marketing. These effects were not seen in the other design projects, and in only two of the SME projects. The key factors behind this difference between SMEs and Micros appeared to be an expectation of knowledge by the Creatives and a failure by the SMEs to transfer knowledge within the company.

Relevance: 100.00%

Abstract:

The non-preemptive two-machine flow-shop scheduling problem with uncertain processing times of n jobs is studied. In an uncertain version of a scheduling problem, there may not exist a unique schedule that remains optimal for all possible realizations of the job processing times. We find necessary and sufficient conditions (Theorem 1) for the existence of a dominant permutation that is optimal for all possible realizations of the job processing times. Our computational studies show the percentage of problems solvable under these conditions for randomly generated instances with n ≤ 100. We also show how to use additional information about the processing times of completed jobs during the optimal realization of a schedule (Theorems 2–4). Computational studies for randomly generated instances with n ≤ 50 show the percentage of two-machine flow-shop scheduling problems solvable under the sufficient conditions given in Theorems 2–4.
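The dominance conditions of Theorems 1–4 are not reproduced in the abstract, but the deterministic problem they build on is the classical two-machine flow-shop, which Johnson's rule solves exactly. The following sketch, with made-up job data, shows that deterministic baseline; an uncertainty-aware dominance check would wrap logic of this kind around interval processing times, with the specific tests depending on the theorems in the paper.

```python
# Johnson's rule for the deterministic two-machine flow-shop (F2 || Cmax).
# Jobs with p1 <= p2 come first in non-decreasing order of p1; the remaining
# jobs follow in non-increasing order of p2. Job data below is illustrative.

def johnson_order(jobs):
    """jobs: list of (job_id, p1, p2); returns a permutation of job_ids."""
    first = sorted((j for j in jobs if j[1] <= j[2]), key=lambda j: j[1])
    last = sorted((j for j in jobs if j[1] > j[2]), key=lambda j: -j[2])
    return [j[0] for j in first + last]

def makespan(times, order):
    """Completion time of the last job on machine 2 for a given permutation."""
    c1 = c2 = 0
    for job_id in order:
        p1, p2 = times[job_id]
        c1 += p1                  # machine 1 processes jobs back to back
        c2 = max(c2, c1) + p2     # machine 2 waits for machine 1 if needed
    return c2

jobs = [("J1", 3, 6), ("J2", 5, 2), ("J3", 1, 2), ("J4", 6, 6)]
order = johnson_order(jobs)
print(order, makespan({j: (a, b) for j, a, b in jobs}, order))
```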

Relevance: 100.00%

Abstract:

The nonlinear inverse synthesis (NIS) method, in which information is encoded directly onto the continuous part of the nonlinear signal spectrum, has recently been proposed as a promising digital signal processing technique for combating fiber nonlinearity impairments. However, because the NIS method is based on the integrability property of the lossless nonlinear Schrödinger equation, the original approach can only be applied directly to optical links with ideal distributed Raman amplification. In this paper, we propose and assess a modified NIS scheme that can be used effectively in standard optical links with lumped amplifiers, such as erbium-doped fiber amplifiers (EDFAs). The proposed scheme takes into account the average effect of the fiber loss to obtain an integrable model (the lossless path-averaged model) to which the NIS technique is applicable. We found that the error between the lossless path-averaged and lossy models increases linearly with transmission distance and input power (measured in dB). We numerically demonstrate the feasibility of the proposed NIS scheme in a burst-mode orthogonal frequency division multiplexing (OFDM) transmission scheme with advanced modulation formats (e.g., QPSK, 16QAM, and 64QAM), showing a performance improvement of up to 3.5 dB; these results are comparable to those achievable with multi-step-per-span digital backpropagation.
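The lossless path-averaged model replaces the lossy evolution over an EDFA span with a lossless nonlinear Schrödinger equation whose nonlinear coefficient is averaged over the span power profile. A minimal sketch of that averaging is given below; the fibre parameters are typical illustrative values, not numbers from the paper.

```python
import numpy as np

# Path-averaged (effective) nonlinear coefficient for a single EDFA span.
# In a lossy span the power decays as P(z) = P0 * exp(-alpha * z); averaging
# the Kerr nonlinearity over the span gives
#   gamma_eff = gamma * (1 - exp(-alpha * L)) / (alpha * L),
# which defines the lossless path-averaged model used as the integrable proxy.
# Parameter values below are illustrative assumptions.

alpha_db_per_km = 0.2                              # fiber loss
gamma = 1.3e-3                                     # nonlinear coefficient, 1/(W*m)
span_km = 80.0

alpha = alpha_db_per_km * np.log(10) / 10 / 1e3    # convert dB/km to 1/m
L = span_km * 1e3
gamma_eff = gamma * (1 - np.exp(-alpha * L)) / (alpha * L)

print(f"gamma_eff / gamma = {gamma_eff / gamma:.3f}")   # ~0.26 for 80 km at 0.2 dB/km
```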

Relevance: 100.00%

Abstract:

The aim of this work is the implementation of a distributed genetic algorithm (the so-called island algorithm) to accelerate the search for an optimum in the solution space. A distributed genetic algorithm is also less likely to become trapped in a local optimum. The approach relies on the mutual cooperation of clients, each of which runs a separate genetic algorithm on a local machine. Java, a technology designed for building networked applications, was chosen as the implementation tool. Java provides a remote method invocation mechanism, Java RMI, through which objects can be passed between the clients and the RMI server.
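The work itself distributes the islands over a network with Java RMI; the sketch below is not that implementation, only a single-process Python illustration of the same island model: several subpopulations evolve independently and periodically exchange their best individuals. The fitness function and parameters are placeholders.

```python
import random

# Single-process sketch of the island (distributed) genetic algorithm:
# each island evolves its own population; every few generations the best
# individual of each island migrates to the next island in a ring.

def fitness(x):                     # placeholder objective: maximize 10 - x^2
    return 10 - x * x

def evolve(pop, rate=0.3):
    """One generation: keep the fitter half, refill with mutated copies."""
    pop = sorted(pop, key=fitness, reverse=True)
    survivors = pop[: len(pop) // 2]
    children = [x + random.gauss(0, rate) for x in survivors]
    return survivors + children

random.seed(1)
islands = [[random.uniform(-5, 5) for _ in range(20)] for _ in range(4)]

for gen in range(50):
    islands = [evolve(pop) for pop in islands]
    if gen % 10 == 9:               # migration step: ring topology
        best = [max(pop, key=fitness) for pop in islands]
        for i, pop in enumerate(islands):
            pop[pop.index(min(pop, key=fitness))] = best[i - 1]

print(max(fitness(x) for pop in islands for x in pop))  # close to 10
```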

Relevance: 100.00%

Abstract:

This work is devoted to the development of a computer-aided system for semantic text analysis of technical specifications. Its purpose is to increase the efficiency of software engineering by automating the semantic analysis of specification text. The work proposes and investigates a technique for analysing the text of a technical specification; constructs an extended fuzzy attribute grammar of the technical specification, intended to formalize a restricted subset of Russian for analysing specification sentences; considers the stylistic features of the technical specification as a class of documents; and formulates recommendations on preparing specification text for automated processing. The computer-aided system itself consists of the following subsystems: preliminary text processing, syntactic and semantic analysis with construction of software models, document storage, and the user interface.
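The subsystem decomposition named above can be read as a linear pipeline. The skeleton below is only a structural sketch of that decomposition with stubbed stages; it does not reproduce the paper's fuzzy attribute grammar, and the function names are placeholders.

```python
# Structural sketch of the described analysis pipeline; each stage is a stub.

def preliminary_processing(text: str) -> list[str]:
    """Split the specification into sentences and normalize them (stub)."""
    return [s.strip() for s in text.split(".") if s.strip()]

def syntactic_semantic_analysis(sentences: list[str]) -> list[dict]:
    """Parse each sentence with the specification grammar (stub)."""
    return [{"sentence": s, "parse": None} for s in sentences]

def build_software_models(parses: list[dict]) -> dict:
    """Assemble software models from the analysed sentences (stub)."""
    return {"requirements": parses}

def analyse_specification(text: str) -> dict:
    models = build_software_models(
        syntactic_semantic_analysis(preliminary_processing(text)))
    # A document-storage subsystem and a user interface would sit on top of this.
    return models

print(analyse_specification("The system shall log every request. Responses shall be signed."))
```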

Relevance: 100.00%

Abstract:

The nonlinear Fourier transform, also known as eigenvalue communications, is a coding, transmission and signal processing technique that makes positive use of the nonlinear Kerr effect in fibre channels. I will discuss recent progress in this field. © 2015 OSA.

Relevance: 100.00%

Abstract:

This study examined the effects of financial aid on the persistence of associate of arts graduates transferring to a senior university in one of four consecutive fall semesters (1998-2001). Situated in an international metropolitan area in the southeastern United States, the institution where the study was conducted is a large public research university identified as a Hispanic Serving Institution. Archival databases served as the source of information on the academic and social background of the 4,669 participants in the study. Data from institutional financial aid records were pooled with the data in the student administrative system.

For purposes of this study, persistence was defined as ongoing progress until completion of the baccalaureate degree. The student social background variables used in the study were gender, ethnicity, age, and income, with GPA and part-time or full-time enrollment status as the academic variables. Amount and type of aid, including grants, loans, scholarships, and work study, were incorporated in the models to determine the effect of financial aid on the persistence of these transfer students. Because the dependent variable, persistence, had three possible outcomes (graduated, still enrolled, dropped out), multinomial logistic regression was the appropriate technique for analyzing the data; four multinomial models were employed in the analysis.

Findings suggest that grants awarded based on the financial need of students, as well as loans, were effective in encouraging the persistence of students, but scholarships and work study were not.
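A multinomial logistic regression with a three-level persistence outcome of the kind described can be sketched as follows. The data frame, column names, and values are hypothetical stand-ins rather than the study's variables, and scikit-learn is used purely as an illustration of the technique.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical stand-in data: one row per transfer student, a three-level
# outcome, and predictors of the kinds named above (all values invented).
df = pd.DataFrame({
    "outcome":   ["graduated", "still_enrolled", "dropped_out", "graduated"] * 50,
    "gpa":       [3.4, 2.8, 2.1, 3.7] * 50,
    "full_time": [1, 0, 0, 1] * 50,
    "grant_aid": [2500, 0, 500, 3000] * 50,
    "loan_aid":  [0, 4000, 2000, 1000] * 50,
})

X = df[["gpa", "full_time", "grant_aid", "loan_aid"]]
y = df["outcome"]
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# With a multi-class outcome, the default lbfgs solver fits a multinomial
# (softmax) model, matching the analysis technique described above.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(model.classes_)                     # the three persistence outcomes
print(model.predict_proba(X_test[:3]))    # per-outcome probabilities
```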

Relevance: 100.00%

Abstract:

As massive data sets become increasingly available, people face the problem of how to process and understand these data effectively. Traditional sequential computing models are giving way to parallel and distributed computing models, such as MapReduce, both because of the large size of the data sets and because of their high dimensionality. This dissertation, in the same direction as other research based on MapReduce, develops effective techniques and applications using MapReduce that can help people solve large-scale problems. Three different problems are tackled in the dissertation. The first deals with processing terabytes of raster data in a spatial data management system: aerial imagery files are broken into tiles to enable data-parallel computation. The second and third problems deal with dimension reduction techniques that can be used to handle data sets of high dimensionality. Three variants of the nonnegative matrix factorization technique are scaled up to factorize matrices with dimensions on the order of millions in MapReduce, based on different matrix multiplication implementations. Two algorithms, which compute the CANDECOMP/PARAFAC and Tucker tensor decompositions respectively, are parallelized in MapReduce by carefully partitioning the data and arranging the computation to maximize data locality and parallelism.
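The matrix products inside a nonnegative matrix factorization update can be expressed as sums of per-block contributions, which is exactly the shape of computation a MapReduce job performs. The sketch below simulates that map/reduce structure in-process with NumPy on a small random matrix; the sizes and data are illustrative, and the actual MapReduce plumbing (e.g., Hadoop jobs) is omitted.

```python
import numpy as np

# MapReduce-style update of H in NMF (V ~ W H): the global products W^T V and
# W^T W are sums of per-row-block contributions, so each "map" task handles one
# block of rows and the "reduce" step simply adds the partial matrices.
rng = np.random.default_rng(0)
m, n, k, block = 1_000, 200, 10, 100          # illustrative sizes
V = rng.random((m, n))
W = rng.random((m, k))
H = rng.random((k, n))

def map_block(start):
    """One mapper: partial W^T V and W^T W for rows [start, start + block)."""
    Vb, Wb = V[start:start + block], W[start:start + block]
    return Wb.T @ Vb, Wb.T @ Wb

# "Shuffle/reduce": sum the partial matrices emitted by all mappers.
partials = [map_block(s) for s in range(0, m, block)]
WtV = sum(p[0] for p in partials)
WtW = sum(p[1] for p in partials)

# Standard multiplicative update for H using only the reduced aggregates.
H *= WtV / (WtW @ H + 1e-9)

print(np.linalg.norm(V - W @ H))              # reconstruction error after one step
```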