879 results for Particle-based Model


Relevance: 40.00%

Abstract:

An augmented reality (AR) device must know the observer's location and orientation, i.e. the observer's pose, to correctly register virtual content to the observer's view. One possible way to determine and continuously track the pose is model-based visual tracking. It assumes that a 3D model of the surroundings is known and that a video camera is fixed to the device. The pose is tracked by comparing the video camera image to the model. Each new pose estimate is usually based on the previous one. However, the first estimate must be obtained without a prior estimate, i.e. the tracking must be initialized, which in practice means that some features must be identified in the image and matched to model features. This is known in the literature as the model-to-image registration problem or the simultaneous pose and correspondence problem. This report reviews visual tracking initialization methods that are suitable for visual tracking in a shipbuilding environment when the ship CAD model is available. The environment is complex, which makes initialization non-trivial. The report has been done as part of the MARIN project.
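The model-to-image registration step described above can be illustrated with a deliberately small sketch: with known camera intrinsics and, for simplicity, a known translation, candidate rotations are scored by how many projected model points land near detected image features, and the best-scoring hypothesis initializes the tracker. The cube model, intrinsics, and poses below are invented for illustration and are not from the report.

```python
import numpy as np

def project(points_3d, R, t, K):
    """Project 3D model points into the image with pose (R, t) and intrinsics K."""
    cam = points_3d @ R.T + t           # world -> camera frame
    uv = cam @ K.T                      # apply pinhole intrinsics
    return uv[:, :2] / uv[:, 2:3]       # perspective divide

def score_pose(model_pts, image_pts, R, t, K, tol=2.0):
    """Count model points whose projection lands within `tol` px of a feature."""
    proj = project(model_pts, R, t, K)
    d = np.linalg.norm(proj[:, None, :] - image_pts[None, :, :], axis=2)
    return int((d.min(axis=1) < tol).sum())

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Toy setup: cube-corner model, hypothetical intrinsics, true pose = 30 deg about z.
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
model = np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)], float)
R_true, t_true = rot_z(np.deg2rad(30.0)), np.array([0.0, 0.0, 5.0])
features = project(model, R_true, t_true, K)   # stand-in for detected 2D features

# Initialization: score a coarse grid of candidate rotations, keep the best.
angles = np.deg2rad(np.arange(0.0, 360.0, 10.0))
best = max(angles, key=lambda a: score_pose(model, features, rot_z(a), t_true, K))
print(round(np.rad2deg(best)))  # 30
```

A real initializer would of course search all six pose parameters and use robust feature descriptors, but the hypothesize-score-select structure is the same.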

Relevance: 40.00%

Abstract:

This article discusses, from the standpoint of cellular biology, the deterministic and indeterministic androgenesis theories. The role of the vacuole and of various types of stress in the deviation of the microspore from normal development, and the point at which androgenetic competence is acquired, are examined. Based on an extensive literature review and data on wheat studies from our laboratory, a model for the androgenetic capacity of the pollen grain is proposed. We propose a two-point deterministic model for the acquisition of androgenetic potential in vitro: the first switch point would be early meiosis and the second the uninucleate pollen stage, because the elimination of cytoplasmic sporophytic determinants takes place at those two strategic moments. Any abnormality in this process that allows the maintenance of sporophytic informational molecules prevents the establishment of a gametophytic program, allowing the reactivation of the embryogenic process.

Relevance: 40.00%

Abstract:

The objective of this research is to assess the state of customer value management in Outotec Oyj, determine the key development areas, and develop a phase model to guide the development of a customer-value-based sales tool. The study was conducted with a constructive research approach, focused on identifying a problem and developing a solution for it. As a basis for the study, the current literature on customer value assessment and on solution and customer value selling was reviewed. The data was collected in 16 interviews conducted in two rounds within the company and was analyzed by open coding. First, seven important development areas were identified, of which the most critical were "Customer value mindset inside the company" and "Coordination of customer value management activities". From these seven areas, three functionality requirements, "Preparation", "Outotec's value creation and communication" and "Documentation", and three development requirements for a customer value sales tool were identified. The study concluded with the formulation of a phase model for building a customer-value-based sales tool. The model consists of five steps: 1) Enable customer value utilization, 2) Connect with the customer, 3) Create customer value, 4) Define the tool to facilitate value selling, and 5) Develop the sales tool. Further practical activities were also recommended as a guide for executing the phase model.

Relevance: 40.00%

Abstract:

Current therapy for pancreatic cancer is multimodal, involving surgery and chemotherapy. Development of new pancreatic cancer therapies requires a thorough evaluation of drug efficacy in vitro before animal testing and subsequent clinical trials. Compared to two-dimensional monolayer culture, three-dimensional (3-D) models more closely mimic native tissues, since the tumor microenvironment established in 3-D models often plays a significant role in cancer progression and in cellular responses to drugs. Accumulating evidence has highlighted the benefits of 3-D in vitro models of various cancers. In the present study, we have developed a spheroid-based 3-D culture of the pancreatic cancer cell lines MIAPaCa-2 and PANC-1 for drug testing, using the acid phosphatase assay. Drug efficacy testing showed that spheroids had much higher drug resistance than monolayers. This model, which is reproducible, simple, and rapid to handle, is the preferred choice for filling the gap between monolayer cell cultures and in vivo models in drug development and testing for pancreatic cancer.

Relevance: 40.00%

Abstract:

Wear particles are phagocytosed by macrophages and other inflammatory cells, resulting in cellular activation and release of proinflammatory factors, which cause periprosthetic osteolysis and subsequent aseptic loosening, the most common causes of total joint arthroplasty failure. During this pathological process, tumor necrosis factor-alpha (TNF-α) plays an important role in wear-particle-induced osteolysis. In this study, recombinant adenovirus (Ad) vectors carrying two target genes [a TNF-α small interfering RNA (TNF-α-siRNA) and bone morphogenetic protein 2 (BMP-2)] were constructed and transfected into RAW264.7 macrophages and pro-osteoblastic MC3T3-E1 cells, respectively. BMP-2 expression in pro-osteoblastic MC3T3-E1 cells and TNF-α silencing in cells treated with titanium (Ti) particles were assessed by real-time PCR and Western blot. We showed, as assessed by alkaline phosphatase activity, that the recombinant adenovirus (Ad-siTNFα-BMP-2) can induce osteoblast differentiation in cells treated with conditioned medium (CM) of RAW264.7 macrophages challenged with a combination of Ti particles and Ad-siTNFα-BMP-2 (Ti-ad CM). Receptor activator of nuclear factor-κB ligand was downregulated in pro-osteoblastic MC3T3-E1 cells treated with Ti-ad CM in comparison with conditioned medium of RAW264.7 macrophages challenged with Ti particles alone (Ti CM). We suggest that Ad-siTNFα-BMP-2 induced osteoblast differentiation and inhibited osteoclastogenesis in a cell model of the Ti-particle-induced inflammatory response, which may provide a novel approach for the treatment of periprosthetic osteolysis.

Relevance: 40.00%

Abstract:

Fluid handling systems such as pump and fan systems have significant potential for energy efficiency improvements. To realize this potential, easily implementable methods are needed to monitor the system output, since this information is required both to identify inefficient operation and to control the output of the pumping system according to process needs. Model-based pump or fan monitoring methods implemented in variable-speed drives have proven able to give information on the system output without additional metering; however, current model-based methods may not be usable or sufficiently accurate over the whole operating range of the fluid handling device. To apply model-based monitoring to a wider selection of systems and to improve its accuracy, this paper proposes a new method for pump and fan output monitoring with variable-speed drives. The method uses a combination of already known operating point estimation methods. Laboratory measurements are used to verify the benefits and applicability of the improved estimation method, and the new method is compared with five previously introduced model-based estimation methods. According to the laboratory measurements, the new estimation method is the most accurate and reliable of the model-based estimation methods.
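The abstract does not specify the combined estimator, but one of the known building blocks it refers to, the QP-curve method, can be sketched: the drive's power and speed estimates are referred to nominal speed with the pump affinity laws (Q ∝ n, P ∝ n³), and the nominal power-flow curve is inverted. The characteristic-curve values below are hypothetical, not from any real pump.

```python
import numpy as np

# Nominal-speed pump characteristic samples, flow (m^3/h) vs. shaft power (kW).
# These numbers are illustrative placeholders, not datasheet values.
q0 = np.array([0.0, 20.0, 40.0, 60.0, 80.0])
p0 = np.array([4.0, 6.5, 8.5, 10.0, 11.0])
n0 = 1450.0  # nominal speed, rpm

def estimate_flow(power_kw, speed_rpm):
    """QP-curve estimate: refer the measured power to nominal speed with the
    affinity laws (P ~ n^3), invert the monotone nominal P(Q) curve, then
    scale the flow back to the actual speed (Q ~ n)."""
    p_at_n0 = power_kw * (n0 / speed_rpm) ** 3
    q_at_n0 = np.interp(p_at_n0, p0, q0)
    return q_at_n0 * (speed_rpm / n0)

# Drive reports 8.5 kW at nominal speed -> read 40 m^3/h off the curve.
print(estimate_flow(8.5, 1450.0))   # 40.0
```

The method's known weakness, and a motivation for combining estimators, is that the P(Q) curve is flat near some operating points, so small power errors become large flow errors there.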

Relevance: 40.00%

Abstract:

Power consumption is still an issue in wearable computing applications. The aim of the present paper is to raise awareness of the power consumption of wearable computing devices in specific scenarios, so that energy-efficient wireless sensors for context recognition in wearable computing applications can be designed in the future. The approach is based on a hardware study. The objective of this paper is to analyze and compare the total power consumption of three representative wearable computing devices in realistic scenarios: Display, Speaker, Camera and microphone, Transfer by Wi-Fi, Monitoring outdoor physical activity, and Pedometer. A scenario-based energy model is also developed. The Samsung Galaxy Nexus I9250 smartphone, the Vuzix M100 Smart Glasses, and the SimValley Smartwatch AW-420.RX are the three devices representative of their form factors. The power consumption is measured using PowerTutor, an Android energy profiler application with a logging option, whose unknown parameters are calibrated against a USB power meter. The results show that screen size is the main parameter influencing power consumption. Power consumption for an identical scenario varies between the wearable devices, meaning that other components, parameters, or processes may affect power consumption; further study is needed to explain these variations. This paper also shows that different inputs (a touchscreen is more efficient than button controls) and outputs (a speaker is more efficient than a display) impact energy consumption in different ways. The paper gives recommendations to reduce energy consumption in healthcare wearable computing applications using the energy model.
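A scenario-based energy model of the kind mentioned above can be sketched as a sum of per-component contributions over a scenario's duration. The component power figures and duty cycles below are invented placeholders, not measurements from the three devices studied.

```python
# Scenario-based energy model sketch: total energy is the sum over hardware
# components of (power draw) x (fraction of time active) x (duration).
# Per-component power figures are hypothetical, in milliwatts.
component_power_mw = {"display": 400, "cpu": 250, "wifi": 300, "gps": 150}

def scenario_energy_mj(duty_cycles, duration_s):
    """Energy in millijoules for one scenario given per-component duty cycles
    (components absent from `duty_cycles` are treated as off)."""
    return sum(component_power_mw[c] * duty_cycles.get(c, 0.0) * duration_s
               for c in component_power_mw)

# Hypothetical "Transfer by Wi-Fi" scenario: display mostly off, radio and CPU busy.
wifi_transfer = {"display": 0.1, "cpu": 0.8, "wifi": 0.9}
print(scenario_energy_mj(wifi_transfer, 60))  # 30600.0 mJ over a 60 s transfer
```

Fitting such a model per device would make the cross-device variations reported in the paper directly attributable to individual components.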

Relevance: 40.00%

Abstract:

Software is a key component in many of our devices and products that we use every day. Most customers demand not only that their devices should function as expected but also that the software should be of high quality, reliable, fault tolerant, efficient, etc. In short, it is not enough that a calculator gives the correct result of a calculation, we want the result instantly, in the right form, with minimal use of battery, etc. One of the key aspects for succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, today, customers are asking for these high quality software products at an ever-increasing pace. This leaves the companies with less time for development. Software testing is an expensive activity, because it requires much manual work. Testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those which have to be fixed after the product is released. One of the main challenges in software development is reducing the associated cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to only demonstrate that a piece of software is functioning correctly. Usually, many other aspects of the software, such as performance, security, scalability, usability, etc., need also to be verified. Testing these aspects of the software is traditionally referred to as nonfunctional testing. One of the major challenges with non-functional testing is that it is usually carried out at the end of the software development process when most of the functionality is implemented. This is due to the fact that non-functional aspects, such as performance or security, apply to the software as a whole. In this thesis, we study the use of model-based testing. 
We present approaches to automatically generate tests from behavioral models to address some of these challenges. We show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than its output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process. Requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor or missing tool support. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools. We offer independent tools, tools that are integrated with other industry-leading tools, and complete tool chains when necessary. Many model-based testing approaches proposed by the research community suffer from poor empirical validation in an industrial context. In order to demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
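In the spirit of the approach above, a minimal sketch of generating tests from a behavioral model: a toy finite state machine (invented here, not taken from the thesis) is walked randomly, and every generated step can be checked back against the model, which is what gives model-based testing its leverage over hand-written cases.

```python
import random

# Behavioral model as a finite state machine: (state, event) -> next state.
# This toy login flow is purely illustrative.
MODEL = {
    ("logged_out", "login_ok"):   "logged_in",
    ("logged_out", "login_fail"): "logged_out",
    ("logged_in",  "logout"):     "logged_out",
    ("logged_in",  "view_page"):  "logged_in",
}

def generate_test(start="logged_out", length=5, seed=0):
    """Random walk over the model; each step is (state, event, next_state)."""
    rng = random.Random(seed)
    state, steps = start, []
    for _ in range(length):
        event, nxt = rng.choice(
            [(e, s2) for (s1, e), s2 in MODEL.items() if s1 == state])
        steps.append((state, event, nxt))
        state = nxt
    return steps

test_case = generate_test()
# Every generated step conforms to the model by construction; in a real tool the
# events would be executed against the system under test and outputs compared.
assert all(MODEL[(s, e)] == n for s, e, n in test_case)
print([e for _, e, _ in test_case])
```

Coverage-guided walks (e.g. covering every transition) would replace the random choice in a production tool; the random walk keeps the sketch short.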

Relevance: 40.00%

Abstract:

In this paper, we review the advances in monocular model-based tracking over the ten-year period up to 2014. In 2005, Lepetit et al. [19] reviewed the status of monocular model-based rigid body tracking. Since then, direct 3D tracking has become quite a popular research area, but monocular model-based tracking should still not be forgotten. We mainly focus on tracking that could be applied to augmented reality, but some other applications are also covered. Given the wide subject area, this paper tries to give a broad view of the research that has been conducted, giving the reader an introduction to the different disciplines that are tightly related to model-based tracking. The work has been conducted by searching through well-known academic search databases in a systematic manner and by selecting certain publications for closer examination. We analyze the results by dividing the found papers into categories by their way of implementation. Issues that have not yet been solved are discussed. We also discuss emerging model-based methods, such as fusing different types of features and region-based pose estimation, which could show the way for future research on this subject.

Relevance: 40.00%

Abstract:

The purpose of this thesis is to focus on credit risk estimation. Different credit risk estimation methods and characteristics of credit risk are discussed. The study is twofold, including an interview with a credit risk specialist and a quantitative section. The quantitative section applies the KMV model to estimate the credit risk of 12 sample companies from three industries: automobile, banking and finance, and technology. The estimation timeframe is one year. On the basis of the KMV model and the interview, implications for the analysis of credit risk are discussed. The KMV model yields results consistent with the existing credit ratings. However, the banking and finance sector requires calibration of the model due to the high leverage of the industry. Credit risk is driven considerably by leverage and by the value and volatility of assets. Credit risk models produce useful information on the creditworthiness of a business. Yet quantitative models often require qualitative support in decision-making situations.
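For orientation, the core quantity of the KMV/Merton framework, the distance to default, can be sketched as below. The firm figures are illustrative; a full KMV implementation would first back out asset value and volatility from equity prices and then map the distance to an empirical default frequency rather than the normal CDF.

```python
from math import log, sqrt, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def distance_to_default(V, D, mu, sigma, T=1.0):
    """Merton/KMV-style distance to default: asset value V, default point D,
    asset drift mu, asset volatility sigma, horizon T (one year here,
    matching the thesis timeframe)."""
    return (log(V / D) + (mu - 0.5 * sigma**2) * T) / (sigma * sqrt(T))

# Illustrative firm: assets 120, default point 80, 5% drift, 25% asset volatility.
dd = distance_to_default(120.0, 80.0, 0.05, 0.25)
pd = norm_cdf(-dd)   # theoretical (not empirical EDF) default probability
print(round(dd, 3))  # 1.697
```

The abstract's calibration remark maps directly onto this formula: for banks, leverage pushes D close to V, shrinking log(V/D) and making the result very sensitive to how the default point is set.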

Relevance: 40.00%

Abstract:

Over time, the demand for quantitative portfolio management has increased among financial institutions, but there is still a lack of practical tools. In 2008, the EDHEC Risk and Asset Management Research Centre conducted a survey of European investment practices. It revealed that the majority of asset and fund management companies, pension funds, and institutional investors do not use more sophisticated models to compensate for the flaws of Markowitz mean-variance portfolio optimization. Furthermore, tactical asset allocation managers employ a variety of methods to estimate the return and risk of assets, but also need sophisticated portfolio management models to outperform their benchmarks. Recent developments in portfolio management suggest that new innovations are slowly gaining ground, but still need to be studied carefully. This thesis tries to provide a practical tactical asset allocation (TAA) application of the Black–Litterman (B–L) approach and an unbiased evaluation of the B–L model's qualities. The mean-variance framework, issues related to asset allocation decisions, and return forecasting are examined carefully to uncover issues affecting active portfolio management. European fixed income data is employed in an empirical study that tries to reveal whether a B–L-model-based TAA portfolio is able to outperform its strategic benchmark. The tactical asset allocation utilizes a vector autoregressive (VAR) model to create return forecasts from lagged values of asset classes as well as economic variables. The sample data (31.12.1999–31.12.2012) is divided into two parts: in-sample data is used for calibrating a strategic portfolio, and the out-of-sample period is used for testing the tactical portfolio against the strategic benchmark. Results show that the B–L-model-based tactical asset allocation outperforms the benchmark portfolio in terms of risk-adjusted return and mean excess return.
The VAR model is able to pick up changes in investor sentiment, and the B–L model adjusts portfolio weights in a controlled manner. The TAA portfolio shows promise especially in moderately shifting allocation to riskier assets while the market is turning bullish, but without overweighting investments with high beta. Based on the findings of this thesis, the Black–Litterman model offers a good platform for active asset managers to quantify their views on investments and implement their strategies. The B–L model shows potential and offers interesting research avenues. However, the success of tactical asset allocation is still highly dependent on the quality of the input estimates.
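The Black–Litterman posterior that such a TAA process rests on can be sketched in a few lines: equilibrium returns are reverse-optimized from the benchmark weights, and a view (here standing in for a VAR-model forecast) tilts them via the B–L master formula. All numbers below are illustrative, not from the thesis data.

```python
import numpy as np

# Two-asset toy market; all figures are illustrative.
Sigma = np.array([[0.0225, 0.0060],
                  [0.0060, 0.0100]])      # asset return covariance
w_mkt = np.array([0.6, 0.4])              # strategic benchmark weights
delta, tau = 2.5, 0.05                    # risk aversion, prior uncertainty scale

mu_eq = delta * Sigma @ w_mkt             # implied equilibrium returns (the prior)

P = np.array([[1.0, -1.0]])               # view: asset 1 outperforms asset 2 ...
Q = np.array([0.02])                      # ... by 2% (stand-in for a VAR forecast)
Omega = P @ (tau * Sigma) @ P.T           # view uncertainty (He-Litterman choice)

# B-L master formula: posterior mean combines prior and views, precision-weighted.
A = np.linalg.inv(tau * Sigma)
mu_bl = np.linalg.solve(A + P.T @ np.linalg.inv(Omega) @ P,
                        A @ mu_eq + P.T @ np.linalg.solve(Omega, Q))
print(mu_bl)   # posterior returns, tilted toward the view
```

With the He-Litterman choice of Omega, the posterior view value is exactly the average of the prior-implied view and Q, which is the "controlled manner" of weight adjustment the abstract refers to.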

Relevance: 40.00%

Abstract:

The aim of this study is to propose a stochastic model for commodity markets linked with the Burgers equation from fluid dynamics. We construct a stochastic particle method for commodity markets, in which particles represent market participants. A discontinuity is included in the model through an interaction kernel equal to the Heaviside function, and its link with the Burgers equation is given. The Burgers equation and the connection of this model with stochastic differential equations are also studied. Further, based on the law of large numbers, we prove the convergence, for large N, of a system of stochastic differential equations describing the evolution of the prices of N traders to a deterministic partial differential equation of Burgers type. Numerical experiments highlight the success of the new proposal in modeling some commodity markets, which is confirmed by the ability of the model to reproduce price spikes when their effects occur over a sufficiently long period of time.
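A minimal sketch of the particle system described above: each trader's price drifts through the Heaviside interaction kernel, which amounts to evaluating the empirical distribution function at the particle's own position, plus Brownian noise. This is an Euler–Maruyama discretization with illustrative, uncalibrated parameters.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate(n_particles=500, n_steps=200, dt=0.01, sigma=0.3):
    """Interacting particle system with Heaviside kernel:
    dX_i = (1/N) sum_j H(X_i - X_j) dt + sigma dW_i."""
    x = rng.normal(0.0, 1.0, n_particles)          # initial "prices"
    for _ in range(n_steps):
        # Drift of particle i = mean_j H(x_i - x_j), i.e. the empirical CDF
        # of the ensemble evaluated at x_i (a value in [0, 1]).
        drift = (x[:, None] >= x[None, :]).mean(axis=1)
        x = x + drift * dt + sigma * np.sqrt(dt) * rng.normal(size=n_particles)
    return x

prices = simulate()
print(prices.mean())   # drift in [0, 1] pushes the ensemble to the right
```

As N grows, the empirical CDF drift converges to the distribution function of the limiting law, which is what links the particle system to a deterministic Burgers-type PDE.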

Relevance: 40.00%

Abstract:

In this work, the magnetic field penetration depth for high-Tc cuprate superconductors is calculated using a recent interlayer pair tunneling (ILPT) model proposed by Chakravarty, Sudbø, Anderson, and Strong [1] to explain high-temperature superconductivity. This model involves a "hopping" of Cooper pairs between layers of the unit cell, which acts to amplify the pairing mechanism within the planes themselves. Recent work has shown that this model can account reasonably well for the isotope effect and the dependence of Tc on nonmagnetic in-plane impurities [2], as well as the Knight shift curves [3] and the presence of a magnetic peak in the neutron scattering intensity [4]. In the latter case, Yin et al. emphasize that the pair tunneling must be the dominant pairing mechanism in the high-Tc cuprates in order to capture the features found in experiments. The goal of this work is to determine whether or not the ILPT model can account for the experimental observations of the magnetic field penetration depth in YBa2Cu3O7−δ. Calculations are performed in the weak and strong coupling limits, and the effects of both small and large strengths of interlayer pair tunneling are investigated. Furthermore, as a follow-up to the penetration depth calculations, both the neutron scattering intensity and the Knight shift are calculated within the ILPT formalism. The aim is to determine whether the ILPT model can yield results consistent with the experiments performed for these properties. The results for all three thermodynamic properties considered are not consistent with the notion that interlayer pair tunneling must be the dominant pairing mechanism in these high-Tc cuprate superconductors. Instead, reasonable agreement with experiments is obtained for small strengths of pair tunneling, while large pair tunneling yields results that do not resemble those of the experiments.
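As a point of reference for penetration depth calculations like these, the empirical two-fluid (Gorter–Casimir) temperature dependence is a common baseline against which microscopic results are compared. The lambda(0) and Tc values below are generic YBCO-like numbers, not results from this work.

```python
import numpy as np

# Two-fluid baseline: lambda(T) = lambda(0) / sqrt(1 - (T/Tc)^4).
# lam0 and Tc are generic illustrative values for a YBCO-like cuprate.
lam0, Tc = 140.0, 92.0   # nm, K

def lam(T):
    """Penetration depth in nm at temperature T (K), valid for T < Tc."""
    t = np.asarray(T, float) / Tc
    return lam0 / np.sqrt(1.0 - t**4)

print(round(float(lam(46.0)), 1))  # 144.6 -- at T = Tc/2 the depth grows only ~3%
```

Any candidate pairing model is typically judged by how its computed lambda(T), especially the low-temperature behavior, deviates from simple forms like this one.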