826 results for Effects-Based Approach to Operations
Abstract:
The current approach to data analysis for the Laser Interferometer Space Antenna (LISA) depends on the time delay interferometry (TDI) observables, which have to be generated before any weak-signal detection can be performed. These are linear combinations of the raw data with appropriate time shifts that lead to the cancellation of the laser frequency noises. This is possible because the same noises occur multiple times in the different raw data streams. Originally, these observables were derived manually, starting with LISA as a simple stationary array and then adjusting for the antenna's motion. However, none of the observables survived the flexing of the arms: the same combinations no longer led to cancellation. The principal component approach, presented by Romano and Woan, is another way of handling these noises; it simplifies the data analysis by removing the need to create the observables before the analysis. This method also depends on the multiple occurrences of the same noises but, instead of using them for cancellation, it takes advantage of the correlations that they produce between the different readings. These correlations can be expressed in a noise (data) covariance matrix, which appears in the Bayesian likelihood function when the noises are assumed to be Gaussian. Romano and Woan showed that performing an eigendecomposition of this matrix produces two distinct sets of eigenvalues, distinguished by the absence of laser frequency noise from one set. Transforming the raw data with the corresponding eigenvectors likewise produces data free from the laser frequency noises. This result led to the idea that the principal components may actually be time delay interferometry observables, since they produce the same outcome: data that are free from laser frequency noise.
The aims here were (i) to investigate the connection between the principal components and these observables, (ii) to prove that data analysis using them is equivalent to that using the traditional observables, and (iii) to determine how this method adapts to the real LISA, especially the flexing of the antenna. To test the connection between the principal components and the TDI observables, a 10 × 10 covariance matrix containing integer values was used in order to obtain an algebraic solution for the eigendecomposition. The matrix was generated using fixed unequal arm lengths and stationary noises with equal variances for each noise type. The results confirm that all four Sagnac observables can be generated from the eigenvectors of the principal components. The observables obtained from this method, however, are tied to the length of the data and are not general expressions like the traditional observables; for example, the Sagnac observables for two different time stamps were generated from different sets of eigenvectors. It was also possible to generate the frequency-domain optimal AET observables from the principal components obtained from the power spectral density matrix. These results indicate that this method is another way of producing the observables; therefore, analysis using principal components should give the same results as analysis using the traditional observables. This was confirmed by the fact that the same relative likelihoods (within 0.3%) were obtained from the Bayesian estimates of the signal amplitude of a simple sinusoidal gravitational wave using the principal components and the optimal AET observables. The method fails if the eigenvalues that are free from laser frequency noise are not generated. These are obtained from the covariance matrix, and the properties of LISA required for its computation are the phase-locking, the arm lengths and the noise variances.
Preliminary results on the effects of these properties on the principal components indicate that only the absence of phase-locking prevented their production. The flexing of the antenna results in time-varying arm lengths, which appear in the covariance matrix; in our toy-model investigations, this did not prevent the occurrence of the principal components. The difficulty with flexing, and also with non-stationary noises, is that the Toeplitz structure of the matrix is destroyed, which affects any computational method that takes advantage of this structure. Separating the two sets of data for the analysis was not necessary, because the laser frequency noises are very large compared with the photodetector noises, so the data containing them are strongly suppressed after the matrix inversion. In the frequency domain the power spectral density matrices were block diagonal, which simplified the computation of the eigenvalues by allowing it to be done separately for each block. The results in general showed a lack of principal components in the absence of phase-locking, except for the zero bin. The major difference with the power spectral density matrix is that the time-varying arm lengths and the non-stationarity do not show up, because of the summation in the Fourier transform.
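The core mechanism described above can be illustrated with a minimal toy sketch (not the abstract's actual 10 × 10 matrix or real LISA noise models): two data streams that share a single large common "laser" noise plus small independent "photodetector" noises produce a covariance matrix whose eigendecomposition splits into a noise-dominated component and a quiet, laser-noise-free component.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model (illustrative only): two raw streams sharing one large
# common laser noise plus small independent photodetector noises.
n = 100_000
laser = 1e3 * rng.standard_normal(n)   # large common laser noise
d1 = laser + rng.standard_normal(n)    # stream 1: laser + small noise
d2 = laser + rng.standard_normal(n)    # stream 2: laser + small noise

# Sample covariance matrix of the two streams; the shared noise makes
# the off-diagonal term almost as large as the diagonal.
C = np.cov(np.vstack([d1, d2]))

# Eigendecomposition (eigenvalues in ascending order): the small
# eigenvalue belongs to the difference-like combination that cancels
# the common laser noise; the large one is laser-noise dominated.
eigvals, eigvecs = np.linalg.eigh(C)
assert eigvals[0] < 10     # laser-noise-free component: O(1) variance
assert eigvals[1] > 1e5    # laser-noise-dominated component
```

Projecting the raw data onto the eigenvector of the small eigenvalue plays the role of a TDI-like combination in this sketch: it is the transformation that yields data free of the common noise.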
Abstract:
The purpose of this paper is to propose a multiobjective optimization approach for solving the manufacturing cell formation problem, explicitly considering the performance of the manufacturing system. Cells are formed so as to simultaneously minimize three conflicting objectives, namely, the level of work-in-process, the intercell moves and the total machinery investment. A genetic algorithm searches the design space in order to approximate the Pareto optimal set. The objective values for each candidate solution in a population are assigned by running a discrete-event simulation, in which the model is automatically generated according to the number of machines and their distribution among cells implied by that solution. The potential of this approach is evaluated through its application to an illustrative example and to a case from the relevant literature. The results obtained are analyzed, and it is concluded that this approach is capable of generating a set of alternative manufacturing cell configurations considering the optimization of multiple performance measures, greatly improving the decision-making process involved in planning and designing cellular systems. (C) 2010 Elsevier Ltd. All rights reserved.
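The comparison underlying any Pareto-based search like the one described can be sketched as a dominance test over the three objectives (the triples below are hypothetical, not from the paper):

```python
def dominates(a, b):
    """True if solution a Pareto-dominates b, all objectives minimized:
    a is no worse than b everywhere and strictly better somewhere."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

# Hypothetical (work-in-process, intercell moves, investment) triples.
s1 = (10.0, 4, 500_000)
s2 = (12.0, 4, 520_000)

assert dominates(s1, s2)        # s1 is better or equal on every objective
assert not dominates(s2, s1)    # so s2 cannot dominate s1
```

The Pareto optimal set that the genetic algorithm approximates is exactly the set of candidate solutions that no other solution dominates under this test.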
Abstract:
An accurate estimate of machining time is very important for predicting delivery time and manufacturing costs, and also for supporting production process planning. Most commercial CAM software systems estimate the machining time in milling operations simply by dividing the entire tool path length by the programmed feed rate. This estimate differs drastically from the real process time because the feed rate is not always constant, due to machine and computer numerical control (CNC) limitations. This study presents a practical mechanistic method for milling time estimation when machining free-form geometries. The method considers a variable called machine response time (MRT), which characterizes the real CNC machine's capacity to move at high feed rates in free-form geometries. MRT is a global performance feature which can be obtained for any type of CNC machine configuration by carrying out a simple test. To validate the methodology, a workpiece was used to generate NC programs for five different types of CNC machines. A practical industrial case study was also carried out to validate the method. The results indicated that MRT, and consequently the real machining time, depends on the CNC machine's potential; furthermore, the greater the MRT, the larger the difference between predicted and real milling time. The proposed method achieved an error range from 0.3% to 12% of the real machining time, whereas the CAM estimation erred by 211% to 1244%. The MRT-based process is also suggested as an instrument for machine tool benchmarking.
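The naive CAM-style estimate criticized in the abstract reduces to a single division; the numbers below are illustrative, not from the study, and the real time can be many times larger once CNC acceleration limits are accounted for.

```python
# Naive CAM-style milling time estimate: total tool path length divided
# by the programmed feed rate, ignoring the machine's acceleration and
# response limits (the source of the 211%-1244% errors reported).
path_length_mm = 12_000.0        # hypothetical total tool path length
feed_rate_mm_per_min = 3_000.0   # hypothetical programmed feed rate

naive_time_min = path_length_mm / feed_rate_mm_per_min
assert naive_time_min == 4.0     # estimate in minutes
```

The paper's MRT-based method corrects this figure using a measured machine response time; its exact formulation is not given in the abstract, so only the baseline is sketched here.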
Abstract:
The Building Partnerships Program at the University of Queensland, Australia, seeks to address the dual challenge of preparing doctors who are responsive to the community while providing a meaningful context for social sciences learning. Through partnerships with a diverse range of community agencies, the program offers students opportunities to gain non-clinical perspectives on health and illness through structured learning activities including family visits; community agency visits and attachments; and interview training. Students learn first-hand about psychosocial influences on health and how people manage health problems on a day-to-day basis. They also gain insights into the work of community agencies and how they, as future doctors, might work in partnership with them to enhance patient care. We outline the main components of the program, identify challenges and successes from student and community agency perspectives, and consider areas that invite further development.
Abstract:
The binary diffusivities of water in low molecular weight sugars (fructose, sucrose) and in a high molecular weight carbohydrate, maltodextrin (DE 11), as well as the effective diffusivities of water in mixtures of these sugars (sucrose, glucose, fructose) and maltodextrin (DE 11), were determined using a simplified procedure based on the Regular Regime Approach. The effective diffusivity of these mixtures exhibited both concentration and molecular weight dependence. Surface stickiness was observed in all samples during desorption, with fructose exhibiting the highest and maltodextrin the lowest. (C) 2002 Elsevier Science Ltd. All rights reserved.
Abstract:
The main purpose of this research is to identify the hidden knowledge and learning mechanisms in the organization in order to disclose tacit knowledge and transform it into explicit knowledge. Most firms tend to duplicate their efforts, acquiring extra knowledge and new learning skills while forgetting to exploit the existing ones, thus wasting resources that could be applied to increase added value within the firm's overall competitive advantage. This unique value, in the shape of creation, acquisition, transformation and application of learning and knowledge, is not disseminated throughout the individual, the group and, ultimately, the company itself. This work is based on three variables that explain the behaviour of learning as the process of construction and acquisition of knowledge, namely internal social capital, technology and external social capital, which include the main attributes of learning and knowledge that help us to capture the essence of this symbiosis. Absorptive capacity provides the right tool to explore this uncertainty within the firm, making it possible to achieve the match between the learning skills and the knowledge needed to support the overall strategy of the firm. This study draws on a sample of the Portuguese textile industry and is based on a multisectorial analysis that makes a cross-functional analysis possible, checking the validity of the results in order to better understand and capture the dynamics of organizational behavior.
Abstract:
The spread and globalization of distributed generation (DG) in recent years has strongly influenced the changes occurring in Electricity Markets (EMs). DG has brought a large number of new players into the EMs, thereby increasing the complexity of these markets. Simulation based on multi-agent systems appears to be a good way of analyzing players' behavior and interactions, especially from a coalition perspective, and the effects these players have on the markets. MASCEM (Multi-Agent System for Competitive Electricity Markets) was created to permit the study of market operation with several different players and market mechanisms. MASGriP (Multi-Agent Smart Grid Platform) is being developed to facilitate the simulation of micro grid (MG) and smart grid (SG) concepts under multiple different scenarios. This paper presents an intelligent management method for MGs and SGs. Simulating different methods of control makes it possible to compare different approaches to responding to market events. Players use electric vehicles' batteries and participate in Demand Response (DR) contracts, taking advantage of the best opportunities offered by the use of all resources to improve their actions in response to MG and/or SG requests.
Abstract:
The present paper results from an ongoing research project in which an information system for monitoring a cultural-touristic route is being developed. The route to be monitored is the Romanesque Route of Tâmega. This route comprises 58 monuments located in the Tâmega region in the North of Portugal. Owing to the particular location of this region, which lies between the coastal zone and the interior, it has weak political influence, which is reflected in the low levels of development observed at several levels. The Romanesque Route was implemented in part of this region in 1998, and extended to the whole region in 2010. In order to evaluate the socio-economic impact of this route on the region, a research project is being developed. The main goal of this paper is to open a discussion on the elements that must be taken into consideration to evaluate the economic and social impact of a touristic cultural route within a region, and within this one in particular.
Abstract:
Forestry in general, and logging in particular, continue to be among the three most hazardous sectors in European countries. The aim of this article is to characterize health and safety problems and solutions in E.U. forestry operations, and particularly in Portuguese operations. Forest types, production, employment and ownership are used to characterize the forest sector. Data on forestry accidents and health problems are presented. Typical hazards associated with the nature of logging operations are systematized. The article emphasizes preventive measures across a wide spectrum, making safety considerations an integral feature of all operational activities, from planning and organization to execution and supervision of work.
Abstract:
Dissertation submitted for the degree of Master in Electrical and Computer Engineering
Abstract:
The MAP-i Doctoral Programme in Informatics, of the Universities of Minho, Aveiro and Porto
Abstract:
Tuberculosis presents a myriad of symptoms, progression routes and propagation patterns not yet fully understood. Whereas for a long time research focused solely on patient immunity and overall susceptibility, it is nowadays widely accepted that the genetic diversity of its causative agent, Mycobacterium tuberculosis, plays a key role in this dynamic. This study focuses on a particular family of genes, the mclxs (Mycobacterium cyclase/LuxR-like genes), which code for a particular and nearly mycobacteria-exclusive combination of protein domains. mclx genes were found to be pseudogenized by frameshift-causing insertion(s)/deletion(s) in a considerable number of M. tuberculosis complex strains and clinical isolates. To discern the functional implications of this pseudogenization, we analysed the pattern of frameshift-causing mutations in a group of M. tuberculosis isolates while taking into account their microbial-, patient- and disease-related traits. Our logistic regression-based analyses revealed disparate effects associated with the transcriptional inactivation of two mclx genes. In fact, mclx2 (Rv1358) pseudogenization appears to be primarily driven by the microbial phylogenetic background, being mainly related to the Euro-American (EAm) lineage; on the other hand, mclx3 (Rv2488c) shows a higher tendency for pseudogenization among isolates from patients born in the Western Pacific area, and among isolates causing extra-pulmonary infections. These results contribute to the overall knowledge of the biology of M. tuberculosis infection, while at the same time laying the necessary basis for the functional assessment of these so far overlooked genes.
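The kind of logistic-regression analysis the abstract describes, relating a binary pseudogenization outcome to isolate traits, can be sketched minimally in pure NumPy on synthetic data (illustrative only; the study's actual covariates, data and model are richer):

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic data: one binary covariate (e.g. membership in a given
# lineage) and a binary outcome (gene pseudogenized or not), generated
# from a known logistic model with intercept -1 and slope 2.
x = rng.integers(0, 2, size=2000).astype(float)
p_true = 1.0 / (1.0 + np.exp(-(-1.0 + 2.0 * x)))
y = (rng.random(2000) < p_true).astype(float)

# Fit (b0, b1) by gradient ascent on the mean log-likelihood.
b0, b1 = 0.0, 0.0
for _ in range(5000):
    mu = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))  # predicted probabilities
    b0 += 1.0 * np.mean(y - mu)                # gradient w.r.t. intercept
    b1 += 1.0 * np.mean((y - mu) * x)          # gradient w.r.t. slope

# Estimates should recover the true coefficients up to sampling noise.
assert abs(b0 + 1.0) < 0.5 and abs(b1 - 2.0) < 0.6
```

A positive fitted slope corresponds to the kind of association reported in the abstract, e.g. a higher odds of mclx2 pseudogenization among Euro-American lineage isolates.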