84 results for Portlet-based application


Relevance:

30.00%

Publisher:

Abstract:

Object detection is a fundamental task of computer vision that serves as a core component in many industrial and scientific applications, for example in robotics, where objects need to be correctly detected and localized before being grasped and manipulated. Existing object detectors vary in (i) the amount of supervision they need for training, (ii) the type of learning method adopted (generative or discriminative) and (iii) the amount of spatial information used in the object model (model-free, using no spatial information, or model-based, with an explicit spatial model of the object). Although some existing methods report good performance in detecting certain objects, the results tend to be application specific, and no universal method has been found that clearly outperforms all others in all areas. This work proposes a novel generative part-based object detector. The generative learning procedure of the developed method allows learning from positive examples only. The detector is based on finding semantically meaningful parts of the object (i.e. a part detector) that can provide information beyond object location, for example pose. The object class model, i.e. the appearance of the object parts and their spatial configuration (constellation), is explicitly modelled in a fully probabilistic manner. The appearance is based on bio-inspired complex-valued Gabor features that are transformed into part probabilities by an unsupervised Gaussian Mixture Model (GMM). The proposed novel randomized GMM enables learning from only a few training examples. The probabilistic spatial model of the part configurations is constructed as a mixture of 2D Gaussians. The appearance of the object parts is learned in an object canonical space that removes geometric variations from the part appearance model.
Robustness to pose variations is achieved by object pose quantization, which is more efficient than the previously used scale and orientation shifts in the Gabor feature space. The performance of the resulting generative object detector is characterized by high recall with low precision, i.e. the generative detector produces a large number of false positive detections. A discriminative classifier is therefore used to prune false positive candidate detections produced by the generative detector, improving its precision while keeping recall high. Using only a small number of positive examples, the developed object detector performs comparably to state-of-the-art discriminative methods.
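The mapping from appearance features to part probabilities via a GMM, described above, amounts to computing posterior component responsibilities. A minimal numpy sketch, assuming an already fitted diagonal-covariance mixture (the function name, array shapes, and the diagonal-covariance simplification are ours, not the thesis'):

```python
import numpy as np

def gmm_part_probabilities(features, means, variances, weights):
    """Posterior probability of each GMM component (object part) for each
    feature vector: the responsibilities that turn appearance features
    into part probabilities.
    features: (N, D); means, variances: (K, D); weights: (K,)."""
    K = means.shape[0]
    log_p = np.empty((features.shape[0], K))
    for k in range(K):
        diff = features - means[k]
        log_p[:, k] = (np.log(weights[k])
                       - 0.5 * np.sum(np.log(2 * np.pi * variances[k]))
                       - 0.5 * np.sum(diff ** 2 / variances[k], axis=1))
    log_p -= log_p.max(axis=1, keepdims=True)   # numerical stability
    p = np.exp(log_p)
    return p / p.sum(axis=1, keepdims=True)
```

Each row of the output sums to one; column k is the probability that the feature vector was generated by part k.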

Relevance:

30.00%

Publisher:

Abstract:

Building a computational model for a complex biological system is an iterative process. It starts from an abstraction of the process and then incorporates more details about the specific biochemical reactions, which changes the model fit. Meanwhile, the model's numerical properties, such as its numerical fit and validation, should be preserved. However, refitting the model after each refinement iteration is computationally expensive. An alternative approach, known as quantitative model refinement, preserves the model fit without the need to refit the model after each refinement iteration. The aim of this thesis is to develop and implement a tool called ModelRef which performs quantitative model refinement automatically. It is implemented both as a stand-alone Java application and as one of the Anduril framework components. ModelRef performs data refinement of a model and generates the results in two well-known formats (SBML and CPS). The tool successfully reduces the time and resources needed, as well as the errors generated, by the traditional approach of refitting the whole model after each iteration.
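The core idea of data refinement, replacing one species by several subspecies while the kinetics are kept so that the fit is unchanged, can be illustrated with a toy sketch. This is not ModelRef's actual algorithm (which operates on full SBML/CPS models); the reaction encoding and function name here are hypothetical:

```python
def refine_species(reactions, species, subspecies):
    """Toy data refinement: every reaction that mentions `species` is
    replaced by one copy per subspecies, each keeping the original rate
    constant. In the simplest (mass-action, interchangeable-subspecies)
    setting this choice preserves the fit: the subspecies together
    reproduce the original species' behaviour.
    A reaction is a tuple (reactants, products, rate_constant)."""
    refined = []
    for reactants, products, k in reactions:
        if species not in reactants and species not in products:
            refined.append((reactants, products, k))
            continue
        for sub in subspecies:
            refined.append((
                tuple(sub if s == species else s for s in reactants),
                tuple(sub if s == species else s for s in products),
                k,
            ))
    return refined
```

Refining B into {B1, B2} in a model with a single B-consuming reaction, for instance, yields two reactions with the original rate constant, while reactions not touching B pass through unchanged.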

Relevance:

30.00%

Publisher:

Abstract:

In this doctoral thesis, a tomographic STED microscopy technique for 3D super-resolution imaging was developed and utilized to observe bone remodeling processes. To improve upon existing methods, we have used a tomographic approach based on a commercially available stimulated emission depletion (STED) microscope. A region of interest (ROI) was observed at two oblique angles: one in the standard inverted configuration from below (bottom view) and another from the side (side view) via a micro-mirror positioned close to the ROI. The two viewing angles were reconstructed into a final tomogram. The technique, named tomographic STED microscopy, achieved an axial resolution of approximately 70 nm on microtubule structures in a fixed biological specimen. High-resolution imaging of osteoclasts (OCs) actively resorbing bone was achieved by creating an optically transparent coating on a microscope coverglass that imitates a fractured bone surface. 2D super-resolution STED microscopy on the bone layer showed approximately 60 nm lateral resolution on a resorption-associated organelle, allowing these structures to be imaged with super-resolution microscopy for the first time. The developed tomographic STED microscopy technique was further applied to study the resorption mechanisms of OCs cultured on the bone coating. The technique revealed an actin cytoskeleton with specific comet-tail structures, some facing upwards and others facing downwards. In our opinion, this indicates that the actin cytoskeleton is involved in vesicular exocytosis and endocytosis during bone resorption. The application of tomographic STED microscopy in bone biology demonstrated that 3D super-resolution techniques can provide new insights into biological 3D nano-structures beyond the diffraction limit when the optical constraints of super-resolution imaging are carefully taken into account.

Relevance:

30.00%

Publisher:

Abstract:

Global warming is one of the most alarming problems of this century. Initial scepticism concerning its validity is currently dwarfed by the intensification of extreme weather events, while the gradually rising level of anthropogenic CO2 is pointed out as its main driver. Most greenhouse gas (GHG) emissions come from large point sources (heat and power production and industrial processes), and the continued use of fossil fuels requires quick and effective measures to meet the world's energy demand while (at least) stabilizing atmospheric CO2 levels. The framework known as Carbon Capture and Storage (CCS), or Carbon Capture, Utilization and Storage (CCUS), comprises a portfolio of technologies applicable to large-scale GHG sources for preventing CO2 from entering the atmosphere. Amongst them, CO2 capture and mineralisation (CCM) presents the highest potential for CO2 sequestration, as the predicted carbon storage capacity (as mineral carbonates) far exceeds the estimated levels of the worldwide identified fossil fuel reserves. The work presented in this thesis aims at taking a step towards the deployment of an energy- and cost-effective process for simultaneous capture and storage of CO2 in the form of thermodynamically stable and environmentally friendly solid carbonates. R&D work on the process considered here began in 2007 at Åbo Akademi University in Finland. It involves the processing of magnesium silicate minerals with recyclable ammonium salts for extraction of magnesium at ambient pressure and 400–440°C, followed by aqueous precipitation of magnesium in the form of hydroxide, Mg(OH)2, and finally Mg(OH)2 carbonation in a pressurised fluidised bed reactor at ~510°C and ~20 bar CO2 pressure to produce high-purity MgCO3.
Rock material taken from the Hitura nickel mine, Finland, and serpentinite collected from Bragança, Portugal, were tested for magnesium extraction with both ammonium sulphate and bisulphate (AS and ABS) to determine the optimal operation parameters, primarily reaction time, reactor type and presence of moisture. Typical efficiencies range from 50 to 80% magnesium extraction at 350–450°C. In general, ABS performs better than AS, showing comparable efficiencies at lower temperatures and shorter reaction times. The best experimental results so far include 80% magnesium extraction with ABS at 450°C in a laboratory-scale rotary kiln and 70% Mg(OH)2 carbonation in the PFB at 500°C and 20 bar CO2 pressure for 15 minutes. The extraction reaction with ammonium salts is not at all selective towards magnesium: other elements, such as iron, nickel, chromium and copper, are also co-extracted. Their separation, recovery and valorisation are addressed as well and found to be of great importance. The exergetic performance of the process was assessed using Aspen Plus® software and pinch analysis. The choice of fluxing agent and its recovery method have a decisive influence on the performance of the process: AS is recovered by crystallisation, and in general the whole process then requires more exergy (2.48–5.09 GJ/t CO2 sequestered) than ABS (2.48–4.47 GJ/t CO2 sequestered) when ABS is recovered by thermal decomposition. However, the corrosive nature of molten ABS and the operational problems inherent to thermal regeneration of ABS prohibit this route. Regeneration of ABS through addition of H2SO4 to AS (followed by crystallisation) results in an overall negative exergy balance (mainly at the expense of low-grade heat) but floods the system with sulphates. Although the ÅA route is still energy intensive, its performance is comparable to conventional CO2 capture methods using alkanolamine solvents.
An energy-neutral process depends on the availability and quality of nearby waste heat, and economic viability might be achieved with magnesium extraction and carbonation levels ≥ 90%, the processing of CO2-containing flue gases (eliminating the expensive capture step), and the production of marketable products.
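The carbonation step above follows Mg(OH)2 + CO2 → MgCO3 + H2O, so the CO2 bound per tonne of hydroxide is fixed by molar masses. A back-of-the-envelope sketch (our own arithmetic, not a figure from the thesis):

```python
# Molar masses in g/mol
M_CO2 = 44.01
M_MG_OH_2 = 58.32   # Mg(OH)2
# Carbonation reaction: Mg(OH)2 + CO2 -> MgCO3 + H2O (1:1 molar ratio)

def co2_bound_per_tonne_mgoh2(carbonation_efficiency):
    """Tonnes of CO2 chemically bound as MgCO3 per tonne of Mg(OH)2
    fed to the pressurised fluidised bed, at the given fractional
    carbonation efficiency (e.g. 0.70 as reported above)."""
    return carbonation_efficiency * M_CO2 / M_MG_OH_2
```

At the 70% carbonation reported above, roughly 0.53 t of CO2 is bound per tonne of Mg(OH)2 fed to the reactor.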

Relevance:

30.00%

Publisher:

Abstract:

Over time the demand for quantitative portfolio management has increased among financial institutions, but there is still a lack of practical tools. In 2008 the EDHEC Risk and Asset Management Research Centre conducted a survey of European investment practices. It revealed that the majority of asset or fund management companies, pension funds and institutional investors do not use more sophisticated models to compensate for the flaws of Markowitz mean-variance portfolio optimization. Furthermore, tactical asset allocation managers employ a variety of methods to estimate the return and risk of assets, but also need sophisticated portfolio management models to outperform their benchmarks. Recent developments in portfolio management suggest that new innovations are slowly gaining ground, but still need to be studied carefully. This thesis aims to provide a practical tactical asset allocation (TAA) application of the Black–Litterman (B–L) approach and an unbiased evaluation of the B–L model's qualities. The mean-variance framework, issues related to asset allocation decisions, and return forecasting are examined carefully to uncover issues affecting active portfolio management. European fixed income data is employed in an empirical study that investigates whether a B–L model based TAA portfolio is able to outperform its strategic benchmark. The tactical asset allocation utilizes a Vector Autoregressive (VAR) model to create return forecasts from lagged values of asset classes as well as economic variables. The sample data (31.12.1999–31.12.2012) is divided into two parts: in-sample data is used for calibrating a strategic portfolio, and the out-of-sample period is used for testing the tactical portfolio against the strategic benchmark. Results show that the B–L model based tactical asset allocation outperforms the benchmark portfolio in terms of risk-adjusted return and mean excess return.
The VAR model is able to pick up changes in investor sentiment, and the B–L model adjusts portfolio weights in a controlled manner. The TAA portfolio shows promise especially in moderately shifting the allocation to riskier assets while the market is turning bullish, but without overweighting investments with high beta. Based on the findings of this thesis, the Black–Litterman model offers a good platform for active asset managers to quantify their views on investments and implement their strategies. The B–L model shows potential and offers interesting research avenues. However, the success of tactical asset allocation is still highly dependent on the quality of the input estimates.
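The B–L blending of equilibrium returns with manager views that the thesis builds on can be sketched in a few lines. A minimal numpy version of the standard posterior-mean formula (parameter names and the τ value are illustrative; the thesis' own calibration is not reproduced here):

```python
import numpy as np

def black_litterman_posterior(pi, Sigma, P, q, Omega, tau=0.05):
    """Black-Litterman posterior expected returns: blends the implied
    equilibrium returns `pi` (prior, covariance tau*Sigma) with the
    manager's views q = P @ mu, whose uncertainty is encoded in Omega.
    mu_BL = [(tau*Sigma)^-1 + P' Omega^-1 P]^-1
            [(tau*Sigma)^-1 pi + P' Omega^-1 q]"""
    inv_tS = np.linalg.inv(tau * Sigma)
    inv_Om = np.linalg.inv(Omega)
    A = inv_tS + P.T @ inv_Om @ P
    b = inv_tS @ pi + P.T @ inv_Om @ q
    return np.linalg.solve(A, b)
```

A confident view (small Omega) pulls the posterior toward the view; a vague view (large Omega) leaves the equilibrium prior essentially unchanged, which is the controlled weight adjustment noted above.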

Relevance:

30.00%

Publisher:

Abstract:

Hydrogen (H2) fuel cells have been considered a promising renewable energy source. The recent growth of the H2 economy requires highly sensitive, micro-sized and cost-effective H2 sensors for monitoring concentrations and alerting to leakages, given the flammability and explosiveness of H2. Titanium dioxide (TiO2) made by electrochemical anodic oxidation has shown great potential as an H2 sensing material. The aim of this thesis is to develop a highly sensitive H2 sensor using anodized TiO2. Preparing the oxide layer on a suitable substrate makes the sensor amenable to mass production and integration with microelectronics. The morphology, elemental composition, crystal phase, electrical properties and H2 sensing properties of TiO2 nanostructures prepared on Ti foil, Si and SiO2/Si substrates were characterized. Initially, vertically oriented TiO2 nanotubes were obtained as the sensing material by anodizing Ti foil. The morphological properties of the tubes could be tailored by varying the applied anodization voltage. The transparent oxide layer creates an interference color phenomenon under white light illumination on the oxide surface; this coloration effect can be used to predict the morphological properties of the TiO2 nanostructures. A crystal phase transition from amorphous to anatase or rutile, or to a mixture of anatase and rutile, was observed with varying heat treatment temperatures. However, the H2 sensing properties of the TiO2 nanotubes at room temperature were insufficient. H2 sensors using TiO2 nanostructures formed on Si and SiO2/Si substrates were therefore demonstrated. In both cases, a Ti layer deposited on the substrate by DC magnetron sputtering was successfully anodized. A mesoporous TiO2 layer obtained on Si by anodization in an aqueous electrolyte at 5°C showed diode behavior, which was influenced by the work function difference between the Pt metal electrodes and the oxide layer.
The sensor enabled the detection of H2 (20–1000 ppm) at low operating temperatures (50–140°C) in ambient air. A Pd-decorated tubular TiO2 layer was prepared on an SiO2/Si wafer patterned with metal electrodes by anodization in an organic electrolyte at 5°C. This sensor showed significantly enhanced H2 sensing properties and detected hydrogen in the range of a few ppm with fast response/recovery times. The metal electrodes placed under the oxide layer also enhanced the mechanical tolerance of the sensor. The concept of TiO2 nanostructures on alternative substrates is a prospect for microelectronic applications and the mass production of gas sensors. The gas sensing properties can be further improved by modifying the material morphology and decorating it with catalytic materials.

Relevance:

30.00%

Publisher:

Abstract:

This thesis studies the privacy perception and personality traits of users in the context of smartphone application privacy. It is divided into two parts. The first part presents an in-depth systematic literature review of the existing academic literature on the relation between privacy perception and personality traits. Demographics, methodologies and other useful insights are extracted, and the available literature is divided into broader groups of topics, bringing the five main areas of research to light, highlighting the current research trends in the field, and pinpointing the research gap of interest to the author. The second part of the thesis uses the results of the literature review to design an empirical study investigating the current privacy perception of users and the correlation between personality traits and privacy perception in smartphone applications. The Big Five personality test is used as the measure of personality traits, whereas three sub-variables are used to measure privacy perception: perceived privacy awareness, perceived threat to privacy, and willingness to trade privacy. According to the study, openness to experience is the most dominant trait, having a strong correlation with two privacy sub-variables, whereas emotional stability does not show any correlation with privacy perception. The empirical study also explores other findings, such as preferred privacy sources and application installation preferences, that provide further insight into users and might be useful in the future.
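Correlations like the one reported between openness to experience and the privacy sub-variables are typically computed as Pearson coefficients over the survey responses. A self-contained sketch (the data in the usage below is synthetic; the thesis' actual dataset and statistics tooling are not specified):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples,
    e.g. a personality-trait score and a privacy sub-variable score."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))
```

Correlating each of the five trait columns against each of the three privacy sub-variables yields a 5×3 correlation matrix of the kind summarized in the study's findings.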

Relevance:

30.00%

Publisher:

Abstract:

The aim of this study is to propose a stochastic model for commodity markets linked with the Burgers equation from fluid dynamics. We construct a stochastic particle method for commodity markets in which particles represent market participants. A discontinuity is included in the model through an interaction kernel equal to the Heaviside function, and its link with the Burgers equation is given. The Burgers equation and the connection of this model with stochastic differential equations are also studied. Further, based on the law of large numbers, we prove the convergence, for large N, of a system of stochastic differential equations describing the evolution of the prices of N traders to a deterministic partial differential equation of Burgers type. Numerical experiments highlight the success of the new proposal in modeling some commodity markets, and this is confirmed by the ability of the model to reproduce price spikes when their effects occur over a sufficiently long period of time.
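An interacting particle system of this kind can be simulated directly with an Euler–Maruyama scheme. A sketch under our own assumed conventions (the drift sign, the value of the Heaviside function at zero, the noise level, and all parameters are illustrative, not the paper's calibration):

```python
import numpy as np

def simulate_particles(n=200, steps=500, dt=0.01, sigma=0.1, seed=0):
    """Euler-Maruyama simulation of the interacting particle system
        dX_i = (1/N) * sum_j H(X_i - X_j) dt + sigma dW_i,
    where H is the Heaviside function (here H(0) = 1 by convention).
    As N -> infinity the empirical law of the particles approaches a
    Burgers-type PDE, per the convergence result described above."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, n)   # initial trader prices
    for _ in range(steps):
        # drift_i = fraction of particles at or below x_i (Heaviside kernel)
        drift = (x[:, None] >= x[None, :]).mean(axis=1)
        x = x + drift * dt + sigma * np.sqrt(dt) * rng.normal(size=n)
    return x
```

For large n, a histogram of the returned prices approximates the density evolved by the limiting Burgers-type equation.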

Relevance:

30.00%

Publisher:

Abstract:

In this study, an infrared thermography based sensor was evaluated with regard to usability and the accuracy of its data as a weld penetration signal in gas metal arc welding. The object of the study was to evaluate a specific sensor type which measures thermography from the solidified weld surface. The purpose of the study was to provide expert data for developing a sensor system for adaptive metal active gas (MAG) welding. Welding experiments with the considered process variables, together with the recorded thermal profiles, were saved to a database for further analysis. To keep the analysis within a reasonable number of experiments, the process parameter variables were altered gradually, by at least 10% at a time. The effects of the process variables on weld penetration and on the thermography itself were then considered. The SFS-EN ISO 5817 (2014) standard was applied to classify the quality of the experiments. As a final step, a neural network was trained on the experiments. The experiments show that the studied thermography sensor and the neural network can be used for controlling full penetration, though they have minor limitations, which are presented in the results and discussion. The results are consistent with previous studies and experiments found in the literature.
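The final step, learning a mapping from thermal-profile features to a penetration decision, can be illustrated with a minimal trainable model. A logistic-regression stand-in in numpy (the thesis does not specify its network architecture, and the features in the usage below are synthetic):

```python
import numpy as np

def train_penetration_classifier(X, y, lr=0.5, epochs=500):
    """Train a single sigmoid unit by gradient descent on cross-entropy
    loss: X holds thermal-profile feature vectors, y holds 0/1 labels
    (1 = full penetration). Returns the learned weights and bias."""
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.01, size=X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted probability
        grad = p - y                             # dLoss/dlogit per sample
        w -= lr * (X.T @ grad) / len(y)
        b -= lr * grad.mean()
    return w, b
```

Predicting full penetration then reduces to checking whether `X @ w + b > 0` for a new thermal profile; the real system would of course use the recorded experiment database rather than synthetic features.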