900 results for Precision timed machines


Relevance: 20.00%

Abstract:

The problem of computing the storage capacity of a feed-forward network with L hidden layers, N inputs, and K units in the first hidden layer is analyzed using techniques from statistical mechanics. We find that the storage capacity depends strongly on the network architecture, α_c ∼ (log K)^(1 − 1/(2L)), and that the number of units K limits the number of possible hidden layers L through the relationship 2L − 1 < 2 log K. © 2014 IOP Publishing Ltd.
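
A minimal sketch of the two relations quoted above, assuming natural logarithms and ignoring the unknown prefactor of α_c, so the printed values are relative scalings only:

```python
import math

def capacity_scaling(K, L):
    """Relative scaling alpha_c ~ (log K)^(1 - 1/(2L)); prefactor unknown."""
    return math.log(K) ** (1 - 1 / (2 * L))

def max_layers(K):
    """Largest L satisfying the abstract's constraint 2L - 1 < 2 log K."""
    L = 1
    while 2 * (L + 1) - 1 < 2 * math.log(K):
        L += 1
    return L

for K in (16, 256, 4096):
    Ls = range(1, max_layers(K) + 1)
    print(K, [round(capacity_scaling(K, L), 3) for L in Ls])
```

Note that for a single hidden layer (L = 1) the exponent reduces to 1/2, i.e. α_c grows like the square root of log K.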

Relevance: 20.00%

Abstract:

In recent years, learning word vector representations has attracted much interest in Natural Language Processing. Word representations or embeddings learned using unsupervised methods help address the problem with traditional bag-of-words approaches, which fail to capture contextual semantics. In this paper we go beyond vector representations at the word level and propose a novel framework that learns higher-level feature representations of n-grams, phrases, and sentences using a deep neural network built from stacked Convolutional Restricted Boltzmann Machines (CRBMs). These representations map syntactically and semantically related n-grams to nearby locations in the hidden feature space. We additionally incorporate these higher-level features into supervised classifier training for two sentiment analysis tasks: subjectivity classification and sentiment classification. Our results demonstrate the success of the proposed framework, with a 4% improvement in accuracy for subjectivity classification and improved results for sentiment classification over models trained without the higher-level features.
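
The inference step of a convolutional RBM layer is simple to illustrate. The sketch below, with toy dimensions and random weights that are assumptions rather than the paper's setup, shows how one hidden layer maps every w-gram window of a word-vector sequence to a feature vector; training (e.g. contrastive divergence) and stacking are omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumptions, not the paper's settings): d-dim word
# vectors, a T-word sentence, filter width w (the n-gram size), and
# H hidden feature maps.
d, T, w, H = 50, 20, 3, 40

X = rng.normal(size=(T, d))             # sentence as a sequence of word vectors
W = 0.01 * rng.normal(size=(H, w, d))   # one w-gram filter per hidden map
b = np.zeros(H)                         # hidden biases

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def hidden_features(X):
    """P(h=1|v) of a convolutional RBM layer: each filter is correlated
    with every w-gram window of the sentence."""
    windows = np.stack([X[t:t + w] for t in range(T - w + 1)])  # (T-w+1, w, d)
    return sigmoid(np.einsum('twd,hwd->th', windows, W) + b)

feats = hidden_features(X)
print(feats.shape)   # (18, 40): an H-dim feature vector per trigram window
```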

Relevance: 20.00%

Abstract:

Emulsions and microcapsules are typical structures in dispersion formulations for pharmaceutical, food, personal-care, and household-care applications. Precise control over the size and size distribution of emulsion droplets and microcapsules is important for the effective use and delivery of active components and for better product quality. Many emulsification technologies have been developed to meet different formulation and processing requirements. Among them, membrane and microfluidic emulsification are emerging technologies able to manufacture droplets precisely, drop by drop, to prescribed sizes and size distributions with lower energy consumption. This paper reviews the fundamental science and engineering aspects of emulsification, membrane and microfluidic emulsification technologies, and their use for the precision manufacture of emulsions for intensified processing. Generic application examples are given for single and double emulsions and for microcapsules with different structural features. © 2013 The Society of Powder Technology Japan. Published by Elsevier B.V.

Relevance: 20.00%

Abstract:

A general technique for transforming a timed finite state automaton into an equivalent automated planning domain based on a numerical parameter model is introduced. Timed transition automata have many applications in control systems and agent models; they are used to describe sequential processes in which actions are labelled by automaton transitions subject to temporal constraints. The language of timed words accepted by a timed automaton, i.e. the possible sequences of system or agent behaviour, can be described in terms of an appropriate planning domain encapsulating the timed action patterns and constraints. The timed-word recognition problem is then posed as a planning problem in which the goal is to reach a final state by a sequence of actions corresponding to the timed symbols labelling the automaton transitions. The transformation is proved to be correct and complete, and it is linear in space and time in the size of the automaton. Experimental results show that the performance of the planning domain obtained by the transformation is scalable to real-world applications. A major advantage of the planning-based approach, besides solving the parsing problem, is that it represents problems of plan recognition, plan synthesis, and plan optimisation in a single automated reasoning framework.
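
As a toy illustration of the recognition side, the sketch below encodes a one-clock timed automaton directly in Python and checks a timed word against its guards. The states, guards, and input word are invented for the example, and the planning-domain transformation itself is not reproduced:

```python
# Transitions: (state, symbol) -> (lower, upper, reset, next_state),
# meaning the symbol is accepted if lower <= clock <= upper.
transitions = {
    ('q0', 'a'): (0.0, 2.0, True,  'q1'),   # 'a' within 2 time units; reset clock
    ('q1', 'b'): (1.0, 3.0, False, 'q2'),   # 'b' between 1 and 3 after the reset
}
accepting = {'q2'}

def accepts(timed_word):
    """timed_word: list of (symbol, absolute_timestamp) pairs."""
    state, clock_start = 'q0', 0.0
    for symbol, t in timed_word:
        if (state, symbol) not in transitions:
            return False
        lo, hi, reset, nxt = transitions[(state, symbol)]
        clock = t - clock_start                 # one-clock valuation
        if not (lo <= clock <= hi):
            return False                        # temporal guard violated
        if reset:
            clock_start = t
        state = nxt
    return state in accepting

print(accepts([('a', 1.5), ('b', 3.0)]))  # True: 'b' arrives 1.5 after the reset
print(accepts([('a', 1.5), ('b', 5.5)]))  # False: guard 1 <= c <= 3 violated
```

In the planning view described above, each transition becomes an action with numerical preconditions on the clock, and acceptance corresponds to finding a plan that reaches an accepting state.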

Relevance: 20.00%

Abstract:

When machining a large-scale aerospace part, the part is normally located and clamped firmly until a set of features has been machined. When the part is released, its size and shape may deform beyond the tolerance limits due to stress release. This paper presents the design of a new fixing method and flexible fixtures that automatically respond to workpiece deformation during machining. Deformation is inspected and monitored on-line, and part location and orientation can be adjusted promptly to ensure that follow-up operations are carried out under low stress and with respect to the related datum defined in the design models.

Relevance: 20.00%

Abstract:

High-precision manufacturers continuously seek out disruptive technologies to improve the quality, cost, and delivery of their products. With the advancement of machine tool and measurement technology, many companies are ready to capitalise on the opportunity of on-machine measurement (OMM). Coupled with the business case, manufacturing engineers are now questioning whether OMM can soon eliminate the need for post-process inspection systems. Metrologists will, however, argue that the machining environment is too hostile and that there are numerous process variables which need consideration before traceable measurement on the machine can be achieved. In this paper we test the measurement capability of five new multi-axis machine tools enabled as OMM systems via on-machine probing. All systems are tested under various operating conditions in order to better understand the effects of potentially significant variables. This investigation found that key process variables such as machine tool warm-up and tool-change cycles can affect machine tool measurement repeatability. The new data presented here are important to the many manufacturers who are considering utilising their high-precision multi-axis machine tools for both the creation and the verification of their products.

Relevance: 20.00%

Abstract:

Measuring and compensating the pivot points of five-axis machine tools is challenging and very time-consuming. This paper presents a newly developed approach for the automatic measurement and compensation of pivot-point positional errors on five-axis machine tools. Machine rotary-axis errors are measured using a circular test. The method has been tested on five-axis machine tools with a swivel-table configuration. Results show that up to 99% of the positional error of the rotary axis can be compensated using this approach.
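
One common way to extract a pivot-point offset from circular-test data, not necessarily the paper's method, is a least-squares circle fit to the probed points: the fitted centre's deviation from the nominal pivot is the positional error to compensate. A sketch with synthetic data:

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_circle(x, y):
    """Algebraic least-squares (Kasa) circle fit: returns centre and radius.
    Solves x^2 + y^2 = 2*cx*x + 2*cy*y + (r^2 - cx^2 - cy^2) linearly."""
    A = np.column_stack([x, y, np.ones_like(x)])
    rhs = x**2 + y**2
    c0, c1, c2 = np.linalg.lstsq(A, rhs, rcond=None)[0]
    cx, cy = c0 / 2, c1 / 2
    return cx, cy, np.sqrt(c2 + cx**2 + cy**2)

# Synthetic circular test: a 100 mm probing circle whose centre is offset
# from the nominal pivot by (0.050, -0.020) mm, with 1 um probing noise.
theta = np.linspace(0, 2 * np.pi, 12, endpoint=False)
x = 0.050 + 100.0 * np.cos(theta) + rng.normal(0, 0.001, theta.size)
y = -0.020 + 100.0 * np.sin(theta) + rng.normal(0, 0.001, theta.size)

cx, cy, r = fit_circle(x, y)
print(f"pivot offset to compensate: dx = {cx:.4f} mm, dy = {cy:.4f} mm")
```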

Relevance: 20.00%

Abstract:

This paper draws upon part of the findings of an ethnographic study in which two seventeen-year-old girls were employed to interview their peers about engineering as a study and career choice. It argues that whilst girls do view engineering as generally masculine in nature, other factors, such as a lack of female role models and an emphasis on physics and maths, act as barriers to young women entering the discipline. The paper concludes by noting that engineering has much to offer young women; the problem is, they simply don't know this is the case! Copyright © 2013 Jane Andrews & Robin Clark.

Relevance: 20.00%

Abstract:

We propose a long-range, high-precision optical time domain reflectometry (OTDR) system based on an all-fiber supercontinuum source. The source simply consists of a CW pump laser with moderate power and a section of fiber whose zero-dispersion wavelength lies near the laser's central wavelength. The spectral and time-domain properties of the source are investigated, showing that it is well suited to nonlinear-optics applications such as correlation OTDR owing to its ultra-wide-band chaotic behaviour, and mm-scale spatial resolution is demonstrated. We then analyze the key factors limiting the operational range of such an OTDR, e.g. integral Rayleigh backscattering and fiber loss, which degrade the optical signal-to-noise ratio at the receiver side, and discuss guidelines for counteracting such signal fading. Finally, we experimentally demonstrate a correlation OTDR with 100 km sensing range and 8.2 cm spatial resolution (1.2 million resolved points), as a verification of the theoretical analysis.
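
The resolved-point figure follows directly from the quoted range and resolution, and the resolution implies a sub-nanosecond correlation time once a fiber group index is assumed (1.468 here, an assumption; the abstract does not state it):

```python
c, n_g = 2.998e8, 1.468          # vacuum light speed; assumed fiber group index
sensing_range_m = 100e3          # 100 km sensing range
resolution_m = 8.2e-2            # 8.2 cm spatial resolution

print(sensing_range_m / resolution_m)   # ~1.22e6 resolved points
print(2 * n_g * resolution_m / c)       # ~0.8 ns implied correlation time
```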

Relevance: 20.00%

Abstract:

Fabrication precision is one of the most critical challenges in the creation of practical photonic circuits composed of coupled high-Q-factor microresonators. While very accurate transient tuning of microresonators based on local heating has been reported, the record precision of permanent resonance positioning achieved by post-processing is still between 1 and 5 GHz. Here we demonstrate two coupled bottle microresonators fabricated at the fiber surface with resonances that are matched to better than 0.16 GHz precision. This corresponds to a better than 0.17 Å precision in the effective fiber radius variation. The achieved fabrication precision is limited only by the resolution of our optical spectrum analyzer and can potentially be improved by an order of magnitude.
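
The correspondence between the two precision figures follows from the whispering-gallery scaling dν/ν ≈ dR/R. A back-of-envelope check, assuming 1550 nm operation and an effective fiber radius of about 20 µm (both assumptions, as the abstract gives neither):

```python
c = 2.998e8
wavelength = 1.55e-6            # assumed telecom-band operation
nu = c / wavelength             # ~193 THz optical carrier
dnu = 0.16e9                    # 0.16 GHz matching precision (abstract)
R = 20e-6                       # assumed ~20 um effective fiber radius

dR = R * dnu / nu               # radius precision implied by dnu/nu = dR/R
print(f"dR = {dR * 1e10:.2f} angstrom")  # ~0.17 A, matching the abstract
```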

Relevance: 20.00%

Abstract:

In our study we apply a data-mining procedure known as the support vector machine (SVM) to the database of the first Hungarian bankruptcy model. The models constructed are then contrasted with the results of earlier bankruptcy models in terms of classification accuracy and the area under the ROC curve. In using the SVM technique, in addition to conventional kernel functions we also examine the possibility of applying the ANOVA kernel function, and we take a detailed look at the data-preparation tasks recommended when using the SVM method (handling of outliers). The results of the models assembled suggest that a significant improvement in classification accuracy can be achieved on the database of the first Hungarian bankruptcy model when using the SVM method as opposed to neural networks.
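
As an illustration of the kernel substitution described above, the ANOVA kernel K(x, y) = (Σ_k exp(−σ(x_k − y_k)²))^d can be supplied to scikit-learn's SVC as a callable that returns the Gram matrix. The sketch below uses synthetic stand-in data, since the Hungarian bankruptcy database is not reproduced here, and the σ and d values are arbitrary:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def anova_kernel(X, Y, sigma=0.5, degree=2):
    """ANOVA kernel: (sum_k exp(-sigma * (x_k - y_k)^2))^degree."""
    diff = X[:, None, :] - Y[None, :, :]          # pairwise feature differences
    return np.exp(-sigma * diff**2).sum(axis=2) ** degree

# Synthetic stand-in for the bankruptcy data (features and labels assumed).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 8))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)

clf = SVC(kernel=anova_kernel)                    # SVC accepts a Gram callable
print(cross_val_score(clf, X, y, cv=5).mean())
```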

Relevance: 20.00%

Abstract:

Parameter design is an experimental design and analysis methodology for developing robust processes and products; robustness implies insensitivity to noise disturbances. Subtle experimental realities, such as the joint effect of process knowledge and analysis methodology, may affect the effectiveness of parameter design in precision engineering, where the objective is to detect minute variation in product and process performance. In this thesis, approaches to statistical forced-noise design and analysis methodologies were investigated with respect to detecting performance variations. Given a low degree of process knowledge, Taguchi's methodology of signal-to-noise ratio analysis was found to be more suitable for detecting minute performance variations than the classical approach based on polynomial decomposition. A comparison of the inner-array noise (IAN) and outer-array noise (OAN) structuring approaches showed that OAN is the more efficient design for precision engineering.
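
For reference, the signal-to-noise metric at the heart of Taguchi's methodology, in its nominal-the-best form commonly used in precision engineering, is SN = 10 log10(ybar^2 / s^2); higher SN means less sensitivity to noise. A quick sketch with two hypothetical parameter settings:

```python
import numpy as np

def sn_nominal_the_best(y):
    """Taguchi nominal-the-best signal-to-noise ratio in dB."""
    y = np.asarray(y, dtype=float)
    return 10 * np.log10(y.mean()**2 / y.var(ddof=1))

# Two hypothetical parameter settings replicated under outer-array noise:
setting_a = [10.02, 9.98, 10.05, 9.95]   # small spread -> high SN (~47 dB)
setting_b = [10.40, 9.60, 10.30, 9.70]   # larger spread -> lower SN (~28 dB)
print(sn_nominal_the_best(setting_a), sn_nominal_the_best(setting_b))
```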

Relevance: 20.00%

Abstract:

The Intoxilyzer 5000 was tested for calibration-curve linearity for ethanol vapor concentrations between 0.020 and 0.400 g/210 L and showed excellent linearity. We evaluated its calibration error using reference solutions outside the allowed concentration range, its response to the same ethanol reference solution at temperatures between 34 and 38 °C, and its response to eleven chemicals potentially found in human breath, ten mixtures of two chemicals at a time, and one mixture of four chemicals. Potential interferents were chosen on the basis of their infrared signatures and of solution concentration ranges corresponding to the non-lethal blood concentration ranges of various volatile organic compounds reported in the literature. The results of this study indicate that the instrument calibrates with solutions outside the allowed range at up to ±10% of the target value. Headspace dual-column GC analysis with FID detection was used to confirm the concentrations of the solutions. Increasing the temperature of the reference solution from 34 to 38 °C resulted in linear increases in the instrument-recorded ethanol readings, with an average increase of 6.25% per °C. Of the eleven chemicals studied, six (isopropanol, toluene, methyl ethyl ketone, trichloroethylene, acetaldehyde, and methanol) could reasonably interfere with the test at the reported non-lethal blood concentration ranges; mixtures of those six chemicals showed linearly additive results, with a combined effect of as much as a 0.080 g/210 L reading (Florida's legal limit) without any ethanol present.
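
The reported temperature sensitivity is easy to put in perspective: at 6.25% per °C, a solution reading 0.080 g/210 L at the 34 °C reference temperature would read about 25% high at 38 °C. A quick check, assuming the percentage applies linearly to the base reading:

```python
# Back-of-envelope check of the reported 6.25%/degree-C sensitivity.
base_reading = 0.080                       # g/210 L at the reference 34 C
for temp in (34, 35, 36, 37, 38):
    inflated = base_reading * (1 + 0.0625 * (temp - 34))
    print(f"{temp} C -> {inflated:.3f} g/210 L")
# At 38 C the same solution reads ~25% high (0.100 g/210 L).
```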

Relevance: 20.00%

Abstract:

Historically, memory has been evaluated by examining how much is remembered; however, a more recent conception of memory focuses on the accuracy of memories. Under this accuracy-oriented conception, unlike the quantity-oriented approach, memory does not always deteriorate over time. A possible explanation for this seemingly surprising finding lies in the metacognitive processes of monitoring and control. These processes allow people to withhold responses of which they are unsure, or to adjust the precision of responses to a level broad enough to be correct. The ability to report memories accurately has implications for investigators who interview witnesses to crimes and for those who evaluate witness testimony.

This research examined the amount of information provided, and the accuracy and precision of responses, during immediate and delayed interviews about a videotaped mock crime. The interview format was manipulated such that either a single free-narrative response was elicited or a series of yes/no or cued questions was asked. Instructions from the interviewer indicated that participants should stress either being informative or being accurate. The interviews were then transcribed and scored.

Results indicate that accuracy rates remained stable and high after a one-week delay. Compared with those interviewed immediately, participants interviewed after a delay provided less information and responses that were less precise. Participants in the free-narrative condition were the most accurate. Participants in the cued-questions condition provided the most precise responses. Participants in the yes/no-questions condition were the most likely to say “I don’t know”. The results indicate that people are able to monitor their memories and modify their reports to maintain high accuracy. When control over precision was not possible, as in the yes/no condition, people said “I don’t know” to maintain accuracy; when withholding responses and adjusting precision were both possible, people used both methods. It seems that concerns that memories reported after a long retention interval might be inaccurate are unfounded.

Relevance: 20.00%

Abstract:

This research is motivated by a practical application observed at a printed circuit board (PCB) manufacturing facility. After assembly, the PCBs (or jobs) are tested in environmental stress screening (ESS) chambers (or batch processing machines) to detect early failures. Several PCBs can be tested simultaneously as long as the total size of all the PCBs in the batch does not violate the chamber capacity. PCBs from different production lines arrive dynamically to a queue in front of a set of identical ESS chambers, where they are grouped into batches for testing. Each line delivers PCBs that vary in size and require different testing (or processing) times. Once a batch is formed, its processing time is the longest processing time among the PCBs in the batch, and its ready time is given by the PCB arriving last to the batch. ESS chambers are expensive and constitute a bottleneck; consequently, the makespan has to be minimized.

A mixed-integer formulation is proposed for the problem under study and compared to a recently published formulation. The proposed formulation is better in terms of the number of decision variables, the number of linear constraints, and run time. A procedure to compute a lower bound is proposed. For sparse problems (i.e. when job ready times are widely dispersed), the lower bounds are close to the optimum.

The problem under study is NP-hard. Consequently, five heuristics, two metaheuristics (simulated annealing (SA) and a greedy randomized adaptive search procedure (GRASP)), and a decomposition approach (column generation) are proposed, especially to solve problem instances that require prohibitively long run times when a commercial solver is used. An extensive experimental study was conducted to evaluate the different solution approaches based on solution quality and run time.

The decomposition approach improved the lower bounds (i.e. the linear-relaxation solution) of the mixed-integer formulation. At least one of the proposed heuristics outperforms the Modified Delay heuristic from the literature. For sparse problems, almost all the heuristics report a solution close to the optimum. GRASP outperforms SA at a higher computational cost. The proposed approaches are viable to implement, as the run time is very short.
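
For a feel of the problem structure, here is an illustrative greedy sketch, deliberately simpler than any of the paper's five heuristics: jobs are packed into capacity-feasible batches in longest-processing-time order, and batches are then dispatched to the earliest-free chamber, respecting ready times. The job data are invented for the example:

```python
def schedule(jobs, capacity, n_chambers):
    """jobs: list of (ready, size, proc) tuples. Returns the makespan.
    A batch's processing time is the max proc among its jobs; its ready
    time is the latest arrival among its jobs."""
    batches = []
    for job in sorted(jobs, key=lambda j: -j[2]):      # longest proc first
        for b in batches:                              # first batch that fits
            if sum(j[1] for j in b) + job[1] <= capacity:
                b.append(job)
                break
        else:
            batches.append([job])
    chambers = [0.0] * n_chambers                      # next-free times
    makespan = 0.0
    # Dispatch longer batches first; each starts when a chamber is free
    # and every PCB in it has arrived.
    for b in sorted(batches, key=lambda b: -max(j[2] for j in b)):
        i = min(range(n_chambers), key=chambers.__getitem__)
        start = max(chambers[i], max(j[0] for j in b))
        chambers[i] = start + max(j[2] for j in b)
        makespan = max(makespan, chambers[i])
    return makespan

jobs = [(0, 2, 5.0), (1, 3, 3.0), (2, 2, 4.0), (0, 4, 2.0)]  # (ready, size, proc)
print(schedule(jobs, capacity=5, n_chambers=2))        # 7.0 for this instance
```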