869 results for NETWORK DESIGN PROBLEMS


Relevance: 40.00%

Abstract:

This thesis studies the problems a software architect faces in their work and the reasons behind them. The purpose of the study is to identify potential factors causing problems in system integration and software engineering, with particular interest in non-technical factors. The thesis was carried out by interviewing professionals who took part in an e-commerce project at a corporation. The interviewees consisted of architects from the technical implementation projects, the leader of the corporation's architect team, various project managers, and a CRM manager. A specific list of themes was used to guide the interviews. The recorded interviews were transcribed and then classified using the ATLAS.ti software. The basics of e-commerce, software engineering, and system integration are also described. The differences between e-commerce, e-business, and traditional business are presented, as are the basic types of e-commerce. Regarding software engineering, the software life span and the general problems of software engineering and software design are covered. In addition, the general problems of system integration and the special requirements set by e-commerce are described. The thesis concludes by describing the problems found in the study and identifying areas of software engineering where development could help avoid similar problems in the future.

Relevance: 40.00%

Abstract:

The presence of e-portfolios in educational centres, companies and administrations has emerged strongly during recent years, creating very different practices stemming from different objectives and purposes. This situation has led researchers and practitioners to design and implement e-portfolios with little reference to previous knowledge of them; consequently, developments are disparate, and many of the processes and dimensions involved in both development and use are unnecessarily complex. In order to minimise these inconveniences, unify the development processes and improve the results of implementing and using e-portfolios, it seemed necessary to create a network of researchers, teachers and trainers from different universities and institutions of various kinds who are interested in the investigation and practice of e-portfolios in Spain. The Network on e-portfolio was therefore created in 2006, funded by the Spanish Ministry of Education and led by the Universitat Oberta de Catalunya. Besides the goals associated with the creation of this network, which we wanted to share with other European researchers and experts from other continents, this paper also presents some data from the first study carried out on the use of e-portfolios in our country, showing where we are and which trends are the most important for the near future.

Relevance: 40.00%

Abstract:

The mobile networks of earlier and current generations, the 2G and 3G networks, provide users with voice and packet services at high transmission rates and good quality over the same core network. When developing the next generation of mobile networks, the current quality of service needs to be maintained. This thesis concentrates on the next generation mobile network, especially on the evolution of the packet network part. The new mobile network places requirements on the common packet backbone network, the Mobile Packet Backbone Network, which is also discussed in this study. The next generation mobile network, called LTE/SAE, is currently under testing. The test system, called the Container Trial System, is a mini-sized LTE/SAE site. The LTE/SAE is studied in this thesis with a focus on the evolved packet core, the SAE part of the composition. The empirical part of the study compares the LTE/SAE Container Trial System with commercial network designs and additionally produces documentation for internal personnel and customers. The research is performed by comparing the documentation and specifications of both the Container Trial System and the commercial network. Since the commercial LTE network has not yet been constructed, the comparison is done theoretically. A further purpose is to find out whether there are any design issues that could be handled differently in the next version of the Container Trial System.

Relevance: 40.00%

Abstract:

The importance of after-sales service, or service in general, can be seen and experienced by customers every day with industrial as well as non-industrial services and products. This dissertation, drawing on theory and experience, focuses on practical engineering implications, specifically the management of customer issues in the after-sales phase in the mobile phone arena. The main objective of this doctoral dissertation is to investigate customer after-sales issue management, specifically regarding mobile phones. The case studies focus on issue resolution time and on corrective actions for issues. This dissertation consists of a main body, four peer-reviewed journal articles, and one manuscript currently under review by a peer-reviewed journal. The main body examines the elements of customer satisfaction, loyalty, and retention with respect to corrective actions for customer issues and issue resolution time, through literature and empirical studies. The five independent works are case studies supporting the thesis research questions. This study examines four questions: 1) What are the factors affecting corrective actions for customers? 2) How can customer issue resolution time be controlled? 3) What are the factors affecting processes in the service chain? and 4) How can communication be measured in a service chain? Both quantitative and qualitative analysis methods are used. The main body of the thesis reviews the literature regarding the elements that bridge the five case studies. The case studies lean toward the methodology of critical positivism and then apply an interpretive approach in interpreting the results. The case study articles employ various statistical methods to analyze and interpret the empirical and survey data. These statistical methods were used to create a model that is useful for significantly optimizing issue resolution time. Moreover, it was found that samples provided by the customer for verifying issues improve neither the perceived quality of corrective actions nor the perceived quality of issue resolution time. The term "service" in this work is limited to the technical services provided by product manufacturers and authorized after-sales service vendors. On the basis of this research, it has been observed that corrective actions and issue resolution time are associated with customer satisfaction and hence, by induction, with customer loyalty and retention. This thesis draws on knowledge of marketing and customer relationships to contribute to the existing body of knowledge concerning information and communication technology for after-sales service recovery of mobile terminals. The models established in the thesis contribute to the existing knowledge of the after-sales process of dealing with customer issues in the field of mobile phones. The findings suggest that process managers should focus more on the communication and training provided to staff, as new technology evolves rapidly. The study also suggests that managers formulate strategies for keeping customers regularly informed of the status of issues that have been escalated for corrective action. The findings also lay the foundation for the comprehensive objective of controlling the entire product development process, starting with conceptualization. This implies that robust design should be applied to new products so that problems affecting customer service quality are not repeated. The objective will be achieved when the entire service chain, from product development to the final user, can be modeled and the model used to support the organization at all levels.
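
The dissertation's statistical models are not reproduced in this abstract, so the sketch below only illustrates the general kind of analysis it describes: fitting a simple linear model of issue resolution time against process factors. All predictor names, coefficients, and data are hypothetical assumptions for illustration, not values from the thesis.

```python
import numpy as np

# All data below is synthetic and all predictors are hypothetical:
# queue length at the service vendor, whether the customer provided
# a sample for verifying the issue, and staff training hours.
rng = np.random.default_rng(0)
n = 200
queue_len = rng.integers(1, 50, n)
sample_given = rng.integers(0, 2, n)
training_hrs = rng.uniform(0, 40, n)

# Synthetic ground truth: resolution time grows with queue length and
# shrinks with training; the customer sample has no effect, echoing
# the finding that samples did not improve perceived quality.
resolution_days = (2.0 + 0.3 * queue_len - 0.05 * training_hrs
                   + rng.normal(0.0, 1.0, n))

# Fit resolution_days ~ intercept + predictors by ordinary least squares.
X = np.column_stack([np.ones(n), queue_len, sample_given, training_hrs])
beta, *_ = np.linalg.lstsq(X, resolution_days, rcond=None)
print("intercept, queue, sample, training:", np.round(beta, 3))
```

On synthetic data like this, the fitted coefficient for the sample indicator stays near zero while the queue and training coefficients are recovered, which is the kind of conclusion such a model supports.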

Relevance: 40.00%

Abstract:

The Internet today has become a vital part of day-to-day life, owing to the revolutionary changes it has brought about in various fields. Dependence on the Internet as an information highway and knowledge bank is increasing exponentially, so that going back is beyond imagination. Critical information is also transferred through the Internet. This widespread use of the Internet, coupled with the tremendous growth in e-commerce and m-commerce, has created a vital need for information security. The Internet has also become an active field for crackers and intruders. The whole development in this area can become null and void if fool-proof security of the data is not ensured, without any chance of it being tampered with. It is hence a challenge for the professional community to develop systems that ensure the security of data sent through the Internet. Stream ciphers, hash functions and message authentication codes play vital roles in providing security services such as confidentiality, integrity and authentication of the data sent through the Internet. There are several such popular and dependable techniques which have been in wide use for quite a long time. This long-term exposure makes them vulnerable to successful or near-successful attacks. Hence there is a need to develop new algorithms with better security. Studies were therefore conducted on the various types of algorithms being used in this area, focusing on identifying the properties that impart security. Using the insight derived from these studies, new algorithms were designed. The performance of these algorithms was then studied, followed by the necessary modifications, yielding an improved system consisting of a new stream cipher algorithm MAJE4, a new hash code JERIM-320 and a new message authentication code MACJER-320. Detailed analysis and comparison with existing popular schemes were also carried out to establish their security levels. The Secure Socket Layer (SSL) / Transport Layer Security (TLS) protocol is one of the most widely used security protocols on the Internet. The cryptographic algorithms RC4 and HMAC have been used to achieve security services such as confidentiality and authentication in SSL/TLS, but recent attacks on RC4 and HMAC have raised questions about the reliability of these algorithms. Hence MAJE4 and MACJER-320 have been proposed as substitutes for them. Detailed studies on the performance of these new algorithms were carried out, and it has been observed that they are dependable alternatives.
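
The internals of MAJE4, JERIM-320 and MACJER-320 are not given in this abstract, so no attempt is made to sketch them. To illustrate the role a message authentication code plays in the integrity and authentication services the abstract discusses, here is a minimal example using the standard HMAC construction from Python's standard library (the key and message are placeholders):

```python
import hmac
import hashlib

key = b"shared-secret-key"          # agreed out of band by both parties
message = b"transfer 100 EUR to account 42"

# Sender computes an authentication tag over the message.
tag = hmac.new(key, message, hashlib.sha256).digest()

def verify(key, message, tag):
    """Recompute the tag and compare in constant time; any change to
    the message (or a wrong key) makes verification fail."""
    expected = hmac.new(key, message, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

assert verify(key, message, tag)
assert not verify(key, message + b"!", tag)
```

A replacement MAC such as MACJER-320 would slot into the same sender/verifier pattern; only the tag computation changes.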

Relevance: 40.00%

Abstract:

In recent years, the unpredictable growth of the Internet has further highlighted the congestion problem, one of the problems that have historically affected the network. This paper deals with the design and evaluation of a congestion control algorithm that adopts a fuzzy controller. The analogy between Proportional Integral (PI) regulators and fuzzy controllers is discussed, and a method to determine the scaling factors of the fuzzy controller is presented. It is shown that the fuzzy controller outperforms the PI regulator under traffic conditions that differ from those of the operating point considered in the design.
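
The paper's fuzzy controller design is not detailed in this abstract. As a point of reference, the sketch below shows the kind of discrete-time PI controller the paper compares against: it regulates a router queue toward a target length by throttling the admitted rate. The gains, target, and plant model are illustrative assumptions, not values from the paper.

```python
# Minimal discrete PI congestion-control sketch (all constants assumed).
KP, KI = 0.05, 0.01       # proportional and integral gains
TARGET = 100.0            # desired queue length (packets)
CAPACITY = 500.0          # link service rate (packets per step)

queue, integral = 0.0, 0.0
for _ in range(200):
    error = queue - TARGET
    integral += error
    # PI law: reduce the admitted rate when the queue exceeds the target.
    rate = max(0.0, CAPACITY - KP * error - KI * integral)
    arrivals = min(rate, 600.0)              # offered load caps arrivals
    queue = max(0.0, queue + arrivals - CAPACITY)

print(f"queue after 200 steps: {queue:.1f}")  # settles near TARGET
```

The integral term drives the steady-state error to zero, but the fixed gains are tuned for one operating point; the paper's claim is precisely that a fuzzy controller degrades less when traffic moves away from that point.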

Relevance: 40.00%

Abstract:

Building services are worth about 2% of GDP and are essential for the effective and efficient operation of buildings. It is increasingly recognised that the value of a building is related to the way it supports the client organisation's ongoing business operations. Building services are central to the functional performance of buildings and provide the necessary conditions for the health, well-being, safety and security of the occupants. They frequently comprise several technologically distinct sub-systems, and their design and construction require the involvement of numerous disciplines and trades. Designers and contractors working on the same project are frequently employed by different companies. Materials and equipment are supplied by a diverse range of manufacturers. Facilities managers are responsible for the operation of the building services in use. The coordination between these participants is crucially important for achieving optimum performance, but it is too often neglected, leaving room for serious faults; effective integration is therefore essential. Modern technology offers increasing opportunities for integrated personal-control systems for lighting, ventilation and security, as well as interoperability between systems. Opportunities for a new mode of systems integration are provided by the emergence of PFI/PPP procurement frameworks. This paper attempts to establish how systems integration can be achieved in the process of designing, constructing and operating building services. The essence of the paper, therefore, is to envisage the emergent organisational responses to the realisation of building services as an interactive systems network.

Relevance: 40.00%

Abstract:

A construction algorithm for multi-output radial basis function (RBF) network modelling is introduced by combining locally regularised orthogonal least squares (LROLS) model selection with a D-optimality experimental design. The proposed algorithm aims to achieve maximised model robustness and sparsity via two effective and complementary approaches. The LROLS method alone is capable of producing a very parsimonious RBF network model with excellent generalisation performance, while the D-optimality design criterion enhances the model efficiency and robustness. A further advantage of the combined approach is that the user only needs to specify a weighting for the D-optimality cost in the combined RBF model selection criterion, and the entire model construction procedure then becomes automatic. The value of this weighting does not critically influence the model selection procedure, and it can be chosen with ease from a wide range of values.
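
The full LROLS procedure, with per-regressor local regularisation and forward orthogonal selection, goes beyond what this abstract specifies. As a simplified stand-in, the sketch below builds a Gaussian RBF design matrix and fits its weights with a single global ridge penalty on synthetic data; the centre placement, kernel width, and penalty value are illustrative assumptions.

```python
import numpy as np

# Synthetic 1-D regression data.
rng = np.random.default_rng(1)
x = np.linspace(-3, 3, 100)
y = np.sin(x) + 0.1 * rng.normal(size=x.size)

centres = x[::10]                     # candidate RBF centres (assumed)
width = 1.0                           # common Gaussian width (assumed)
# Design matrix: one Gaussian basis function per centre.
P = np.exp(-(x[:, None] - centres[None, :]) ** 2 / (2 * width ** 2))

# Regularised least squares; a single global penalty stands in for
# the per-regressor local regularisation used by LROLS.
lam = 1e-2
theta = np.linalg.solve(P.T @ P + lam * np.eye(P.shape[1]), P.T @ y)
print("RMS error:", np.sqrt(np.mean((P @ theta - y) ** 2)))
```

The actual algorithm would additionally rank and select candidate centres one at a time, trading the least-squares cost against the D-optimality term rather than fitting all candidates at once.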

Relevance: 40.00%

Abstract:

This paper introduces a new neurofuzzy model construction and parameter estimation algorithm for observed finite data sets, based on a Takagi-Sugeno (T-S) inference mechanism and a new extended Gram-Schmidt orthogonal decomposition algorithm, for the modeling of a priori unknown dynamical systems in the form of a set of fuzzy rules. The first contribution of the paper is the introduction of a one-to-one mapping between a fuzzy rule-base and a model matrix feature subspace using the T-S inference mechanism. This link enables the numerical properties associated with a rule-based matrix subspace, the relationships amongst these matrix subspaces, and the correlation between the output vector and a rule-base matrix subspace to be investigated and extracted as rule-based knowledge to enhance model transparency. The matrix subspace spanned by a fuzzy rule is initially derived as the input regression matrix multiplied by a weighting matrix that consists of the corresponding fuzzy membership functions over the training data set. Model transparency is explored by deriving an equivalence between an A-optimality experimental design criterion of the weighting matrix and the average model output sensitivity to the fuzzy rule, so that rule-bases can be effectively measured by their identifiability via the A-optimality experimental design criterion. The A-optimality experimental design criterion of the weighting matrices of the fuzzy rules is used to construct an initial model rule-base. An extended Gram-Schmidt algorithm is then developed to estimate the parameter vector for each rule. This new algorithm decomposes the model rule-bases via an orthogonal subspace decomposition approach, so as to enhance model transparency with the capability of interpreting the derived rule-base energy level. This new approach is computationally simpler than the conventional Gram-Schmidt algorithm for resolving high-dimensional regression problems, in which it is computationally desirable to decompose complex models into a few submodels rather than a single model with a large number of input variables and the associated curse of dimensionality. Numerical examples are included to demonstrate the effectiveness of the proposed algorithm.
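
The extended, rule-base-oriented decomposition itself is not specified in this abstract, but the conventional Gram-Schmidt decomposition it builds on can be sketched. The example below orthogonalises a regression matrix P into P = WA and computes each orthogonal regressor's energy contribution to the output, the quantity OLS-type algorithms use to rank regressors; the data and matrix sizes are arbitrary synthetic placeholders.

```python
import numpy as np

def modified_gram_schmidt(P):
    """Orthogonalise the columns of regression matrix P (n x m).

    Returns W (mutually orthogonal columns) and unit upper-triangular
    A such that P = W @ A, the decomposition underlying OLS methods.
    """
    n, m = P.shape
    W = P.astype(float).copy()
    A = np.eye(m)
    for k in range(m):
        for j in range(k + 1, m):
            A[k, j] = (W[:, k] @ W[:, j]) / (W[:, k] @ W[:, k])
            W[:, j] -= A[k, j] * W[:, k]
    return W, A

rng = np.random.default_rng(2)
P = rng.normal(size=(50, 4))          # synthetic regressors
y = rng.normal(size=50)               # synthetic output vector
W, A = modified_gram_schmidt(P)

# Energy (error-reduction) contribution of each orthogonal regressor:
# err_j = g_j^2 * ||w_j||^2 / (y . y), with g_j the orthogonal weight.
col_norms_sq = np.einsum("ij,ij->j", W, W)
g = (W.T @ y) / col_norms_sq
err = g ** 2 * col_norms_sq / (y @ y)
print("per-regressor energy contributions:", np.round(err, 3))
```

In the paper's extension, whole groups of columns belonging to one fuzzy rule would be decomposed as a subspace, so the energy level can be read per rule rather than per individual regressor.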

Relevance: 40.00%

Abstract:

This contribution introduces a new digital predistorter to compensate for the serious distortions caused by high power amplifiers (HPAs) with memory which exhibit output saturation characteristics. The proposed design is based on direct learning using a data-driven B-spline Wiener system modeling approach. The nonlinear HPA with memory is first identified based on the B-spline neural network model using the Gauss-Newton algorithm, which incorporates the efficient De Boor algorithm with both B-spline curve and first-derivative recursions. The estimated Wiener HPA model is then used to design the Hammerstein predistorter. In particular, the inverse of the amplitude distortion of the HPA's static nonlinearity can be calculated effectively using the Newton-Raphson formula based on the inverse of the De Boor algorithm. A major advantage of this approach is that both the Wiener HPA identification and the Hammerstein predistorter inversion can be achieved very efficiently and accurately. Simulation results are presented to demonstrate the effectiveness of this novel digital predistorter design.
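
The B-spline and De Boor machinery cannot be reconstructed from this abstract alone, so the sketch below isolates just the Newton-Raphson inversion step on a stand-in closed-form saturating nonlinearity g(x); in the paper, the same iteration would instead be driven by the identified B-spline static nonlinearity and its derivative.

```python
import math

def g(x):
    """Stand-in saturating amplitude nonlinearity (assumed form)."""
    return x / math.sqrt(1.0 + x * x)

def g_prime(x):
    """Analytic derivative of g, playing the role of the B-spline
    first-derivative recursion in the paper."""
    return (1.0 + x * x) ** -1.5

def invert(y, iters=20):
    """Solve g(x) = y for x by Newton-Raphson iteration."""
    x = y                        # initial guess in the linear region
    for _ in range(iters):
        x -= (g(x) - y) / g_prime(x)
    return x

y = 0.6
x = invert(y)
print(f"g({x:.6f}) = {g(x):.6f}, target {y}")
```

Applying this inverse to the desired output amplitude before the HPA is what linearises the static part of the cascade; the memory part is handled by the Hammerstein structure.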

Relevance: 40.00%

Abstract:

The pipe sizing of water networks via evolutionary algorithms is of great interest because it allows the selection of alternative economical solutions that meet a set of design requirements. However, the available evolutionary methods are numerous, and methodologies to compare the performance of these methods beyond obtaining a minimal solution for a given problem are currently lacking. A methodology to compare algorithms based on an efficiency rate (E) is presented here and applied to the pipe-sizing problem of four medium-sized benchmark networks (Hanoi, New York Tunnel, GoYang and R-9 Joao Pessoa). E numerically determines the performance of a given algorithm while considering both the quality of the obtained solution and the required computational effort. From the wide range of available evolutionary algorithms, four were selected to implement the methodology: a PseudoGenetic Algorithm (PGA), Particle Swarm Optimization (PSO), Harmony Search (HS) and a modified Shuffled Frog Leaping Algorithm (SFLA). After more than 500,000 simulations, a statistical analysis was performed based on the specific parameters each algorithm requires to operate, and finally, E was analyzed for each network and algorithm. The efficiency measure indicated that the PGA is the most efficient algorithm for problems of greater complexity and that HS is the most efficient algorithm for less complex problems. However, the main contribution of this work is that the proposed efficiency ratio provides a neutral strategy to compare optimization algorithms and may be useful in the future for selecting the most appropriate algorithm for different types of optimization problems.
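
The paper's exact definition of E is not given in this abstract. The sketch below shows one plausible shape of such a comparison harness: each algorithm is run several times, the best cost and the function evaluations spent are recorded, and a quality-per-effort score is formed. Both the scoring formula and the stand-in solver are assumptions for illustration only, not the paper's method.

```python
import random

def compare(algorithms, runs=30, best_known=1.0):
    """Score each algorithm by average solution quality per average
    function evaluation; higher is better. This formula is an
    illustrative assumption, not the paper's definition of E."""
    scores = {}
    for name, solve in algorithms.items():
        quality, effort = 0.0, 0
        for seed in range(runs):
            cost, evals = solve(seed)      # each run returns (cost, evals)
            quality += best_known / cost   # equals 1.0 at the known optimum
            effort += evals
        scores[name] = (quality / runs) / (effort / runs)
    return scores

def dummy_solver(seed):
    """Stand-in 'evolutionary algorithm' returning a random
    (best cost, evaluations used) pair for demonstration."""
    rng = random.Random(seed)
    return 1.0 + rng.random(), rng.randrange(1_000, 5_000)

print(compare({"PGA": dummy_solver, "PSO": dummy_solver}))
```

In a real study, each entry in the dictionary would wrap one tuned optimizer run against a benchmark network's hydraulic model, and best_known would be the network's best published cost.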