937 results for high-order reasoning


Relevance: 30.00%

Abstract:

The paper provides evidence of a turn-of-the-year effect in the order flow imbalance of both retail and institutional investors. In December there is net selling pressure, which is reversed in January. We examine high-frequency intraday order flow information and find that the changes in order flow imbalance between December and January are related to firm risk factors and characteristics. We find that retail order flow imbalances are associated with a wide range of risk characteristics, including beta, illiquidity, and unsystematic risk, whereas imbalances in institutional order flow are associated with only a small number of risk variables. We show that these order flow changes are important because risk premiums are elevated in January. Our results are robust to the effects of decimalization.
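
The imbalance measure at the heart of this analysis is simple to compute from signed trade data; below is a minimal sketch (the column names, toy numbers, and buyer/seller classification are illustrative assumptions, not the paper's dataset):

```python
import pandas as pd

def order_flow_imbalance(trades: pd.DataFrame) -> float:
    """Net (buy - sell) volume normalised by total volume, in [-1, 1].

    Assumes each trade already carries a sign: +1 if buyer-initiated,
    -1 if seller-initiated (e.g. classified with the Lee-Ready rule).
    """
    signed = trades["volume"] * trades["sign"]
    return signed.sum() / trades["volume"].sum()

# Hypothetical toy data: December shows net selling, January net buying.
trades = pd.DataFrame({
    "month":  [12, 12, 12, 1, 1, 1],
    "volume": [100, 250, 80, 300, 120, 60],
    "sign":   [-1, -1, +1, +1, +1, -1],
})
for month, group in trades.groupby("month"):
    print(f"month {month}: imbalance = {order_flow_imbalance(group):+.3f}")
```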

Relevance: 30.00%

Abstract:

High-precision manufacturers continuously seek out disruptive technologies to improve the quality, cost, and delivery of their products. With the advancement of machine tool and measurement technology, many companies are ready to capitalise on the opportunity of on-machine measurement (OMM). Coupled with a business case, manufacturing engineers are now questioning whether OMM can soon eliminate the need for post-process inspection systems. Metrologists will, however, argue that the machining environment is too hostile and that there are numerous process variables which need consideration before traceable measurement on the machine can be achieved. In this paper we test the measurement capability of five new multi-axis machine tools enabled as OMM systems via on-machine probing. All systems are tested under various operating conditions in order to better understand the effects of potentially significant variables. This investigation has found that key process variables such as machine tool warm-up and tool-change cycles can have an effect on machine tool measurement repeatability. The new data presented here are important to many manufacturers who are considering utilising their high-precision multi-axis machine tools for both the creation and verification of their products.
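
Probing repeatability of the kind tested here is usually summarised as the dispersion of repeated measurements of the same feature under a fixed condition; a minimal sketch (the condition labels and readings are invented for illustration):

```python
import statistics

# Hypothetical repeated probe readings (mm) of one datum feature,
# grouped by machine state before and after a warm-up cycle.
readings = {
    "cold_start": [10.0012, 10.0018, 10.0007, 10.0021, 10.0015],
    "warmed_up":  [10.0010, 10.0011, 10.0009, 10.0012, 10.0010],
}

for condition, values in readings.items():
    mean = statistics.mean(values)
    # Sample standard deviation as the repeatability estimate.
    repeatability = statistics.stdev(values)
    print(f"{condition}: mean = {mean:.4f} mm, "
          f"1-sigma repeatability = {repeatability * 1000:.2f} um")
```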

Relevance: 30.00%

Abstract:

It has never been easy for manufacturing companies to understand their confidence level in terms of how accurately, and with what degree of flexibility, parts can be made. This brings uncertainty in choosing the most suitable manufacturing method, as well as in controlling product and process verification systems. The aim of this research is to develop a system for capturing a company's knowledge and expertise and reflecting it in an MRP (Manufacturing Resource Planning) system. A key activity here is measuring manufacturing and machining capabilities to a reasonable confidence level. For this purpose an in-line control measurement system is introduced to the company. Using SPC (Statistical Process Control) not only helps to predict trends in the manufacture of parts but also minimises human error in measurement. A Gauge R&R (Repeatability and Reproducibility) study identifies problems in measurement systems. Measurement is like any other process in terms of variability; reducing this variation via an automated machine probing system helps to avoid defects in future products.

Developments in the aerospace, nuclear, and oil and gas industries demand materials with high performance and high temperature resistance under corrosive and oxidising environments. Superalloys were developed in the latter half of the 20th century as high-strength materials for such purposes. For the same characteristics, superalloys are considered difficult-to-cut alloys when it comes to forming and machining. Furthermore, due to the sensitivity of superalloy applications, in many cases they must be manufactured to tight tolerances. In addition, superalloys, specifically nickel-based ones, have unique features such as low thermal conductivity due to the high amount of nickel in their composition. This causes a high surface temperature on the workpiece at the machining stage, which leads to deformation in the final product.

As in every process, material variations have a significant impact on machining quality. The main causes of variation are chemical composition and mechanical hardness. The non-uniform distribution of metal elements is a major source of variation in metallurgical structures. Different heat treatment standards are designed for processing the material to the desired hardness levels based on application. In order to take corrective actions, a study of the material aspects of superalloys has been conducted. In this study, samples from different batches of material have been analysed. This involved material preparation for microscopy analysis, and assessment of the effect of chemical composition on hardness (before and after heat treatment). Some of the results are discussed and presented in this paper.
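
The SPC element described here boils down to deriving control limits from subgroup statistics and flagging drift; a minimal X-bar chart sketch (the measurements and the 3-sigma limit convention are illustrative assumptions):

```python
import statistics

# Hypothetical in-line probe measurements (mm), one subgroup per batch.
subgroups = [
    [25.001, 25.003, 24.999],
    [25.002, 25.000, 25.001],
    [25.008, 25.010, 25.007],   # a batch with an upward shift
]

means = [statistics.mean(g) for g in subgroups]
grand_mean = statistics.mean(means)
# Average within-subgroup standard deviation as the dispersion estimate.
sigma = statistics.mean(statistics.stdev(g) for g in subgroups)
n = len(subgroups[0])

ucl = grand_mean + 3 * sigma / n ** 0.5   # upper control limit
lcl = grand_mean - 3 * sigma / n ** 0.5   # lower control limit

for i, m in enumerate(means):
    status = "ok" if lcl <= m <= ucl else "OUT OF CONTROL"
    print(f"subgroup {i}: mean = {m:.4f} mm ({status})")
print(f"UCL = {ucl:.4f} mm, LCL = {lcl:.4f} mm")
```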

Relevance: 30.00%

Abstract:

This paper presents a novel algorithm for medial surface extraction that is based on the density-corrected Hamiltonian analysis of Torsello and Hancock [1]. In order to cope with the exponential growth of the number of voxels, we compute a first coarse discretization of the mesh, which is iteratively refined until a desired resolution is achieved. The refinement criterion relies on the analysis of the momentum field, where only the voxels with a suitable value of the divergence are exploded to a lower level of the hierarchy. In order to compensate for the discretization errors incurred at the coarser levels, a dilation procedure is added at the end of each iteration. Finally, we design a simple alignment procedure to correct the displacement of the extracted skeleton with respect to the true underlying medial surface. We evaluate the proposed approach with an extensive series of qualitative and quantitative experiments. © 2013 Elsevier Inc. All rights reserved.
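
The coarse-to-fine strategy can be pictured as a recursion that subdivides ("explodes") only voxels whose momentum-field divergence looks medial; the sketch below is a schematic of that control flow, not the authors' implementation (the placeholder field, threshold, and helper names are assumptions):

```python
# Schematic of divergence-driven voxel refinement (not the authors' code).
# A voxel is (x, y, z, size); divergence_at is a hypothetical stand-in for
# the divergence of the density-corrected Hamiltonian momentum field.

DIV_THRESHOLD = -0.5   # assumed: strongly negative divergence marks medial voxels

def divergence_at(x: float, y: float, z: float) -> float:
    # Placeholder slab-like field whose medial surface is the plane x = 0.
    return -1.0 if abs(x) < 0.3 else 0.0

def refine(voxel, min_size):
    """Return the finest-level voxels retained as medial-surface candidates."""
    x, y, z, size = voxel
    if divergence_at(x, y, z) > DIV_THRESHOLD:
        return []                        # momentum field has no sink here: prune
    if size <= min_size:
        return [voxel]                   # target resolution reached: keep
    # Explode the voxel into its eight children at half the size.
    half, q = size / 2, size / 4
    children = [(x + dx * q, y + dy * q, z + dz * q, half)
                for dx in (-1, 1) for dy in (-1, 1) for dz in (-1, 1)]
    kept = []
    for child in children:
        kept.extend(refine(child, min_size))
    return kept

skeleton = refine((0.0, 0.0, 0.0, 1.0), min_size=0.125)
print(len(skeleton), "candidate voxels")   # 256 with this placeholder field
```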

Relevance: 30.00%

Abstract:

The Semantic Web has come a long way since its inception in 2001, especially in terms of technical development and research progress. However, adoption by non-technical practitioners is still an ongoing process, and in some areas this process is only now starting. Emergency response is an area where reliability and timeliness of information and technologies are of the essence. It is therefore quite natural that widespread adoption has not been seen in this area until now, when Semantic Web technologies are mature enough to support its high requirements. Nevertheless, to leverage the full potential of Semantic Web research results for this application area, there is a need for an arena where practitioners and researchers can meet and exchange ideas and results. Our intention is for this workshop, and hopefully subsequent workshops in the same series, to be such an arena for discussion. The Extended Semantic Web Conference (ESWC, formerly the European Semantic Web Conference) is one of the major research conferences in the Semantic Web field, and is therefore a suitable venue for a workshop discussing the application of Semantic Web technology to our specific application area. Hence, we chose to arrange our first SMILE workshop at ESWC 2013. This workshop does not, however, focus solely on semantic technologies for emergency response, but rather on Semantic Web technologies in combination with technologies and principles for what is sometimes called the "social web". Social media has already been used successfully in many cases as a tool for supporting emergency response. The aim of this workshop is therefore to take this to the next level and answer questions such as: how can we make sense of, and furthermore make use of, all the data produced by different kinds of social media platforms in an emergency situation?

For the first edition of this workshop the chairs collected the following main topics of interest:
• Semantic annotation for understanding the content and context of social media streams.
• Integration of social media with Linked Data.
• Interactive interfaces and visual analytics methodologies for managing multiple large-scale, dynamic, evolving datasets.
• Stream reasoning and event detection.
• Social data mining.
• Collaborative tools and services for citizens, organisations, and communities.
• Privacy, ethics, trustworthiness, and legal issues in the Social Semantic Web.
• Use case analysis, with specific interest in use cases that involve the application of social media and Linked Data methodologies in real-life scenarios.

All of these, applied in the context of:
• Crisis and disaster management
• Emergency response
• Security and citizen journalism

The workshop received six high-quality paper submissions, and after a thorough review process, thanks to our program committee, four of these papers were accepted for the workshop (67% acceptance rate). These four papers can be found later in this proceedings volume. Three of the four papers discuss the integration and analysis of social media data using Semantic Web technologies, e.g. for detecting complex events in social media streams, for visualizing and analysing sentiments with respect to certain topics in social media, or for detecting small-scale incidents entirely through the use of social media information. The fourth paper presents an architecture for using Semantic Web technologies in resource management during a disaster.

Additionally, the workshop featured an invited keynote speech by Dr. Tomi Kauppinen from Aalto University. Dr. Kauppinen shared experiences from his work on applying Semantic Web technologies to application fields such as geoinformatics and scientific research, i.e. so-called Linked Science, as well as recent ideas and applications in the emergency response field. His input was also highly valuable for the roadmapping discussion held at the end of the workshop; a separate summary of the roadmapping session can be found at the end of these proceedings. Finally, we would like to thank our invited speaker Dr. Tomi Kauppinen, all our program committee members, and the workshop chair of ESWC 2013, Johanna Völker (University of Mannheim), for helping us to make this first SMILE workshop a highly interesting and successful event!

Relevance: 30.00%

Abstract:

In this work, we study for the first time the influence of microwave power higher than 2.0 kW on bonded hydrogen impurity incorporation (form and content) in nanocrystalline diamond (NCD) films grown in a 5 kW MPCVD reactor. NCD samples of different thicknesses, ranging from 25 to 205 μm, were obtained through the simultaneous addition of small amounts of nitrogen and oxygen into a conventional reactant mixture of about 4% methane in hydrogen, keeping the other operating parameters in the same range as typically used for the growth of large-grained polycrystalline diamond films. Specific hydrogen point defects in the NCD films are analyzed using Fourier-transform infrared (FTIR) spectroscopy. When the other operating parameters are kept constant (mainly the input gases), increasing the microwave power from 2.0 to 3.2 kW (the pressure was increased slightly in order to stabilize a plasma ball of the same size), which simultaneously raises the substrate temperature by more than 100 °C, increases the growth rate of the NCD films by one order of magnitude, from 0.3 to 3.0 μm/h, while the content of hydrogen impurity trapped in the NCD films during growth decreases with power. It has also been found that a new H-related infrared absorption peak appears at 2834 cm⁻¹ in NCD films grown with a small amount of nitrogen and oxygen addition at powers higher than 2.0 kW, and that it increases with power above 3.0 kW. In light of these new experimental results, the role of high microwave power in diamond growth and hydrogen impurity incorporation is discussed on the basis of the standard growth mechanism of CVD diamond using CH4/H2 gas mixtures. Our experimental findings shed light on the incorporation mechanism of hydrogen impurities in NCD films grown with small amounts of nitrogen and oxygen added to a methane/hydrogen plasma.

Relevance: 30.00%

Abstract:

Objective: To test the practicality and effectiveness of cheap, ubiquitous, consumer-grade smartphones in discriminating Parkinson's disease (PD) subjects from healthy controls, using self-administered tests of gait and postural sway. Background: Existing tests for the diagnosis of PD are based on subjective neurological examinations performed in-clinic. Objective movement symptom severity data, collected using widely accessible technologies such as smartphones, would enable the remote characterization of PD symptoms based on self-administered behavioral tests. Smartphones, when backed up by interviews using web-based videoconferencing, could make it feasible for expert neurologists to perform diagnostic testing on large numbers of individuals at low cost. However, to date, the compliance rate of testing using smartphones has not been assessed. Methods: We conducted a one-month controlled study with twenty participants, comprising 10 PD subjects and 10 controls. All participants were provided identical LG Optimus S smartphones, capable of recording tri-axial acceleration. Using these smartphones, participants conducted self-administered, short (less than 5 minutes) controlled gait and postural sway tests. We analyzed a wide range of summary measures of gait and postural sway from the accelerometry data. Using statistical machine learning techniques, we identified discriminating patterns in the summary measures in order to distinguish PD subjects from controls. Results: Compliance was high: all 20 participants performed an average of 3.1 tests per day for the duration of the study. Using this test data, we demonstrated cross-validated sensitivity of 98% and specificity of 98% in discriminating PD subjects from healthy controls. Conclusions: Using consumer-grade smartphone accelerometers, it is possible to distinguish PD subjects from healthy controls with high accuracy. Since these smartphones are inexpensive (around $30 each) and easily available, and the tests are highly non-invasive and objective, we envisage that this kind of smartphone-based testing could radically increase the reach and effectiveness of experts in diagnosing PD.
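
The analysis pipeline, summary measures extracted from tri-axial accelerometry and fed to a cross-validated classifier, can be sketched as follows (the features, synthetic data, and choice of a random forest are illustrative assumptions, not the study's exact method):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def summary_features(accel: np.ndarray) -> np.ndarray:
    """A few simple gait/sway summaries from a tri-axial recording (n, 3)."""
    magnitude = np.linalg.norm(accel, axis=1)
    return np.array([
        magnitude.mean(),                  # overall activity level
        magnitude.std(),                   # sway/tremor dispersion
        np.abs(np.diff(magnitude)).mean()  # short-term signal roughness
    ])

# Hypothetical dataset: 20 subjects, a month of tests each (toy counts).
X, y = [], []
for subject in range(20):
    is_pd = subject < 10
    for _ in range(30):
        noise = 1.5 if is_pd else 1.0      # toy group difference
        accel = rng.normal(0.0, noise, size=(500, 3))
        X.append(summary_features(accel))
        y.append(int(is_pd))

clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, np.array(X), np.array(y), cv=5)
print("cross-validated accuracy:", scores.mean().round(3))
```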

Relevance: 30.00%

Abstract:

In this work, we report high growth rates of nanocrystalline diamond (NCD) films on 2-inch-diameter silicon wafers using a new growth regime, which employs high power and a CH4/H2/N2/O2 plasma in a 5 kW MPCVD system. This is distinct from the commonly used hydrogen-poor Ar/CH4 chemistries for NCD growth. Upon raising the microwave power from 2000 W to 3200 W, the growth rate of the NCD films increases from 0.3 to 3.4 μm/h; that is, an order-of-magnitude enhancement in growth rate is achieved at high microwave power. The morphology, grain size, microstructure, orientation or texture, and crystalline quality of the NCD samples were characterized by scanning electron microscopy (SEM), atomic force microscopy (AFM), X-ray diffraction, and micro-Raman spectroscopy. The combined effect of nitrogen addition, microwave power, and temperature on NCD growth is discussed from the point of view of gas-phase chemistry and surface reactions. © 2011 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Abstract:

The spray zone is an important region for controlling the nucleation of granules in a high shear granulator. In this study, a spray zone with cross flow is quantified as a well-mixed compartment in a high shear granulator. Granulation kinetics are quantitatively derived at both the particle scale and the spray zone scale. Two spatial decay rates, the DGSDR (droplet-granule spatial decay rate) ζDG and the DPSDR (droplet-primary particle spatial decay rate) ζDP, which are functions of the volume fraction and diameter of the particulate species within the powder bed, are defined to simplify the derivation. Explicit analytical results show that in cross flow the droplet concentration decays exponentially with depth, which yields a numerically infinite spray zone depth in a real penetration process. In a well-mixed spray zone, the depth of the spray zone is 4/(ζDG + ζDP) for a cuboid and π²/[3(ζDG + ζDP)] for a cylinder. The first-order droplet-based collision rates, the nucleation rate B0 and the rewetting rate RW0, are uncorrelated with the flow pattern and shape of the spray zone. The second-order droplet-based collision rate, the nucleated granule-granule collision rate RGG, is correlated with the mixing pattern. Finally, a real formulation case of a high shear granulation process is used to estimate the size of the spray zone. The results show that the spray zone is a thin layer at the powder bed surface. We present, for the first time, the spray zone as a well-mixed compartment. The granulation kinetics of a well-mixed spray zone could be integrated into a Population Balance Model (PBM), particularly to aid development of a distributed model for product quality prediction.
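
The compartment depths quoted above follow from the exponential decay of droplet concentration with bed depth; a small numerical sketch (the decay-rate values are arbitrary assumptions, and the cylinder depth uses the expression as reconstructed above):

```python
import math

# Assumed spatial decay rates (per mm of bed depth), illustrative only.
zeta_DG = 0.8   # droplet-granule spatial decay rate
zeta_DP = 1.2   # droplet-primary-particle spatial decay rate
zeta = zeta_DG + zeta_DP

# Droplet concentration decays exponentially with depth z: c(z) = c0 * exp(-zeta * z),
# so it never reaches exactly zero (a numerically infinite penetration depth).
for z in (0.0, 1.0, 2.0, 4.0):
    print(f"z = {z:.1f} mm: c/c0 = {math.exp(-zeta * z):.4f}")

# Equivalent well-mixed compartment depths.
depth_cuboid = 4 / zeta
depth_cylinder = math.pi ** 2 / (3 * zeta)
print(f"well-mixed depth (cuboid):   {depth_cuboid:.3f} mm")
print(f"well-mixed depth (cylinder): {depth_cylinder:.3f} mm")
```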

Relevance: 30.00%

Abstract:

High-power, high-voltage-gain dc-dc converters are key to high-voltage direct current (HVDC) power transmission for offshore wind power. This paper presents an isolated ultra-high step-up dc-dc converter in a matrix transformer configuration. A flyback-forward converter is adopted as the power cell, and a secondary-side matrix connection is introduced to increase the power level and improve fault tolerance. Because of the modular structure of the converter, the stress on the switching devices is decreased, as is the transformer size. The proposed topology can be operated in column-interleaved, row-interleaved, and hybrid working modes in order to deal with the varying energy from the wind farm. Furthermore, fault-tolerant operation is also realized in several fault scenarios. A 400-W dc-dc converter with four cells is developed and experimentally tested to validate the proposed technique, which can be applied to high-power, high-voltage dc power transmission.

Relevance: 30.00%

Abstract:

Waste biomass is generated during the conservation management of semi-natural habitats and represents an unused resource and potential bioenergy feedstock that does not compete with food production. Thermogravimetric analysis was used to characterise a representative range of biomass generated during conservation management in Wales. Of the biomass types assessed, those dominated by rush (Juncus effusus) and bracken (Pteridium aquilinum) exhibited the highest and lowest volatile compositions respectively, and were selected for bench-scale conversion via fast pyrolysis. Each biomass type was ensiled, and a sub-sample of silage was washed and pressed. Demineralization of conservation biomass through washing and pressing was associated with higher oil yields following fast pyrolysis. The oil yields were within the published range established for the dedicated energy crops miscanthus and willow. To examine the potential, a multiple-output energy system was developed, with gross power production estimated following valorisation of the press fluid, char, and oil. If used in multi-fuel industrial burners, the char and oil alone would displace 3.9 × 10⁵ tonnes per year of No. 2 light oil using Welsh biomass from conservation management. Bioenergy and product development using these feedstocks could simultaneously support biodiversity management and displace fossil fuels, thereby reducing GHG emissions. Gross power generation predictions show good potential.

Relevance: 30.00%

Abstract:

The medial prefrontal cortex (mPFC) is frequently reported to play a central role in Theory of Mind (ToM). However, the contribution of this large cortical region to ToM is not well understood. Combining a novel behavioral task with fMRI, we sought to demonstrate functional divisions between dorsal and rostral mPFC. All conditions of the task required the representation of mental states (beliefs and desires). The level of demand on cognitive control (high vs. low) and the nature of the demands on reasoning (deductive vs. abductive) were varied orthogonally between conditions. Activation in dorsal mPFC was modulated by the need for control, whereas rostral mPFC was modulated by reasoning demands. These findings fit with previously suggested domain-general functions for different parts of mPFC and suggest that these functions are recruited selectively in the service of ToM.

Relevance: 30.00%

Abstract:

INTRODUCTION: We investigated whether interictal thalamic dysfunction in patients with migraine without aura (MO) is a primary determinant or the expression of the thalamus's functional disconnection from proximal or distal areas along the somatosensory pathway. METHODS: Twenty MO patients and twenty healthy volunteers (HVs) underwent electroencephalographic (EEG) recording during electrical stimulation of the median nerve at the wrist. We used the functional source separation algorithm to extract four functionally constrained nodes (brainstem, thalamus, primary sensory radial, and primary sensory motor tangential parietal sources) along the somatosensory pathway. Two digital filters (1-400 Hz and 450-750 Hz) were applied in order to extract low-frequency (LFO) and high-frequency (HFO) oscillatory activity from the broadband signal. RESULTS: Compared to HVs, patients presented significantly lower brainstem (BS) and thalamic (Th) HFO activation bilaterally. No group differences were seen in either of the two cortical HFO activations or in LFO peak activations. The age of onset of the headache was positively correlated with HFO power in the right brainstem and thalamus. CONCLUSIONS: This study provides evidence for a complex dysfunction of brainstem and thalamocortical networks under the control of genetic factors that might act by modulating the severity of the migraine phenotype.
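
The two-band decomposition described is a standard digital filtering step; a minimal sketch with SciPy (the sampling rate, filter order, and Butterworth design are assumptions; the study's actual filter design is not specified here):

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 2000  # Hz; assumed sampling rate (must exceed twice the 750 Hz band edge)

def band(signal: np.ndarray, lo: float, hi: float, fs: int = FS) -> np.ndarray:
    """Zero-phase Butterworth band-pass: one way to split LFO from HFO."""
    sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, signal)

# Synthetic evoked trace: a slow wave plus a small high-frequency burst.
rng = np.random.default_rng(0)
t = np.arange(0, 0.2, 1 / FS)
x = (np.sin(2 * np.pi * 30 * t)
     + 0.1 * np.sin(2 * np.pi * 600 * t)
     + 0.05 * rng.standard_normal(t.size))

lfo = band(x, 1, 400)      # low-frequency oscillatory activity
hfo = band(x, 450, 750)    # high-frequency oscillatory activity
print("LFO power:", float(np.mean(lfo ** 2)), "HFO power:", float(np.mean(hfo ** 2)))
```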

Relevance: 30.00%

Abstract:

The contributions of this dissertation are in the development of two new interrelated approaches to video data compression: (1) a level-refined motion estimation and subband compensation method for effective motion estimation and motion compensation; (2) a shift-invariant sub-decimation decomposition method to overcome the deficiency of the decimation process in estimating motion, which stems from the shift-variant nature of the wavelet transform's decimation.

The enormous volume of data generated by digital video creates an intense need for efficient video compression techniques to conserve storage space and minimize bandwidth utilization. The main idea of video compression is to reduce the inter-pixel redundancies within and between video frames by applying motion estimation and motion compensation (MEMC) in combination with spatial transform coding. To locate the global minimum of the matching criterion function reliably, hierarchical motion estimation with coarse-to-fine resolution refinement using the discrete wavelet transform is applied, owing to its intrinsic multiresolution and scalability properties.

Because most of the energy is concentrated in the low-resolution subbands and decreases in the high-resolution subbands, a new approach called the level-refined motion estimation and subband compensation (LRSC) method is proposed. It realizes the possible intra-blocks in the subbands for lower-entropy coding while keeping the low computational load of level-refined motion estimation, thus achieving both temporal compression quality and computational simplicity.

Since circular convolution is applied in the wavelet transform to obtain the decomposed subframes without coefficient expansion, a symmetric-extended wavelet transform is designed for the finite-length frame signals, giving more accurate motion estimation without discontinuous boundary distortions.

Although wavelet-transformed coefficients still contain spatial-domain information, motion estimation in the wavelet domain is not as straightforward as in the spatial domain, owing to the shift-variance of the decimation process of the wavelet transform. A new approach called the sub-decimation decomposition method is proposed, which maintains motion consistency between the original frame and the decomposed subframes, consequently improving wavelet-domain video compression through shift-invariant motion estimation and compensation.
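
The coarse-to-fine principle can be illustrated with a standard multilevel wavelet decomposition: estimate motion on the half-resolution approximation subband, then refine at full resolution around the scaled-up vector. The sketch below uses PyWavelets and generic block matching; it is an illustration of the hierarchical idea, not the dissertation's LRSC or sub-decimation method:

```python
import numpy as np
import pywt

def block_match(ref, cur, block=8, radius=2, init=(0, 0)):
    """Full-search block matching around an initial displacement."""
    best, best_sad = init, np.inf
    r0, c0 = 8, 8                              # one example block location
    target = cur[r0:r0 + block, c0:c0 + block]
    for dr in range(init[0] - radius, init[0] + radius + 1):
        for dc in range(init[1] - radius, init[1] + radius + 1):
            cand = ref[r0 + dr:r0 + dr + block, c0 + dc:c0 + dc + block]
            if cand.shape != target.shape:
                continue                        # skip out-of-bounds candidates
            sad = np.abs(cand - target).sum()   # sum of absolute differences
            if sad < best_sad:
                best, best_sad = (dr, dc), sad
    return best

# Synthetic frame pair: frame2 is frame1 shifted by (4, 6) pixels, so the
# current-to-reference motion vector for interior blocks is (-4, -6).
rng = np.random.default_rng(1)
frame1 = rng.random((64, 64))
frame2 = np.roll(frame1, shift=(4, 6), axis=(0, 1))

# Coarse estimate on the level-1 approximation subband (half resolution)...
a1_ref, _ = pywt.dwt2(frame1, "haar")
a1_cur, _ = pywt.dwt2(frame2, "haar")
coarse = block_match(a1_ref, a1_cur, radius=4)

# ...then refine at full resolution around the doubled coarse vector.
fine = block_match(frame1, frame2, init=(2 * coarse[0], 2 * coarse[1]))
print("coarse (half-res):", coarse, "-> refined (full-res):", fine)
```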

Relevance: 30.00%

Abstract:

Current reform initiatives recommend that school geometry teaching and learning include the study of three-dimensional geometric objects and provide students with opportunities to use spatial abilities in mathematical tasks. Geometer's Sketchpad (GSP), a dynamic and interactive computer program, used in conjunction with manipulatives, enables students to investigate and explore geometric concepts, especially in a constructivist setting. Research on spatial abilities has focused on visual reasoning to improve visualization skills. This dissertation investigated the hypothesis that connecting visual and analytic reasoning may better improve students' spatial visualization abilities, compared with instruction that makes little or no use of that connection. Data were collected using the Purdue Spatial Visualization Tests (PSVT), administered as a pretest and posttest to a control group and two experimental groups. Sixty-four 10th-grade students in three geometry classrooms participated in the study over 6 weeks. Research questions were answered using statistical procedures: an analysis of covariance was used for the quantitative analysis, whereas students' visual-analytic processing strategies were described using qualitative methods. The quantitative results indicated significant differences by gender, but not by group. However, when analyzing a subsample of 33 participants with pretest scores below the 50th percentile, males in one of the experimental groups benefited significantly from the treatment. A review of previous research also indicated that students with low visualization skills benefit more than those with higher visualization skills. The qualitative results showed that girls were more sophisticated in their visual-analytic processing strategies for solving three-dimensional tasks. It is recommended that the teaching and learning of spatial visualization start in middle school, prior to students' exposure to more rigorous mathematics in high school. Treatment durations longer than 6 weeks are also recommended for similar future studies.