54 results for Test, Black-box testing


Relevance:

30.00%

Publisher:

Abstract:

The purpose of this master's thesis was to perform simulations involving random numbers while testing hypotheses, particularly on two sample populations compared by their means, variances, or Sharpe ratios. Specifically, we simulated several well-known distributions in Matlab and checked the accuracy of the corresponding hypothesis tests. Furthermore, we examined what happens when the bootstrapping method described by Efron is applied to the simulated data. In addition, the robust Sharpe ratio hypothesis test proposed in the paper by Ledoit and Wolf was applied to measure the statistical significance of the performance difference between two investment funds, based on testing whether there is a statistically significant difference between their Sharpe ratios. We reviewed the literature on the topic and ran extensive Matlab simulations to pursue this aim. The results gave us a good understanding that tests are not always accurate, for instance when testing whether two normally distributed random vectors come from the same normal distribution. The Jarque-Bera test for normality showed that for the normal random vectors r1 and r2, only 94.7% and 95.7% of the samples, respectively, were identified as coming from a normal distribution, while 5.3% and 4.3% failed to confirm the known truth; but when we introduced Efron's bootstrapping method for estimating the p-values on which the hypothesis decision is based, the test was 100% accurate. These results indicate that bootstrapping methods should always be considered when testing or estimating statistics, because in most cases the outcomes are accurate and computational errors are minimized. The robust Sharpe test, which uses one of the bootstrapping methods (the studentised one), was applied first to simulated data covering distributions of many kinds and shapes, and secondly to real data on hedge and mutual funds. The test performed well, confirming the existence of a statistically significant difference between their Sharpe ratios, as described in the paper by Ledoit and Wolf.
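
As a rough illustration of the bootstrap idea discussed above, the following Python sketch estimates a percentile-bootstrap confidence interval for the difference between two funds' Sharpe ratios. It uses a plain i.i.d. bootstrap rather than the studentised circular block bootstrap of Ledoit and Wolf, and the two return series are hypothetical.

```python
import numpy as np

def sharpe(r):
    """Sample Sharpe ratio: mean return over sample standard deviation."""
    return r.mean() / r.std(ddof=1)

def bootstrap_sharpe_diff(r1, r2, n_boot=10_000, seed=0):
    """Percentile-bootstrap 95% interval for sharpe(r1) - sharpe(r2).

    Each series is resampled independently with replacement (a plain
    i.i.d. bootstrap, simpler than the studentised block bootstrap of
    Ledoit and Wolf, which also handles serial dependence).
    """
    rng = np.random.default_rng(seed)
    diffs = np.empty(n_boot)
    for b in range(n_boot):
        s1 = rng.choice(r1, size=r1.size, replace=True)
        s2 = rng.choice(r2, size=r2.size, replace=True)
        diffs[b] = sharpe(s1) - sharpe(s2)
    lo, hi = np.percentile(diffs, [2.5, 97.5])
    return sharpe(r1) - sharpe(r2), lo, hi

# Hypothetical monthly returns for two funds.
rng = np.random.default_rng(42)
fund_a = rng.normal(0.010, 0.04, 120)
fund_b = rng.normal(0.005, 0.05, 120)
diff, lo, hi = bootstrap_sharpe_diff(fund_a, fund_b)
print(f"Sharpe difference {diff:.3f}, 95% bootstrap CI [{lo:.3f}, {hi:.3f}]")
# If the interval excludes zero, the difference is significant at the 5% level.
```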

Relevance:

30.00%

Publisher:

Abstract:

Currently, a high penetration level of Distributed Generations (DGs) has been observed in the Danish distribution systems, and even more DGs are foreseen in the upcoming years. How to utilize them to maintain the security of the power supply under emergency situations has been of great interest for study. This master's project develops a control architecture for studying distribution systems with large-scale integration of solar power. As part of the EcoGrid EU Smart Grid project, it focuses on the system modelling and simulation of a representative Danish LV network located on the island of Bornholm. Regarding the control architecture, two types of reactive power control techniques are implemented and compared. In addition, a network voltage control based on a tap changer transformer is tested. After applying a genetic algorithm to five typical Danish domestic loads, the optimized results show lower power losses and voltage deviation with Q(U) control, especially under large consumption. Finally, a communication and information exchange system is developed with the objective of regulating the reactive power, and thereby the network voltage, remotely and in real time. Validation tests of the simulated parameters are performed as well.
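
As a minimal sketch of the Q(U) idea compared in the project, the following Python function implements a generic piecewise-linear reactive power droop; the dead band, saturation voltages, and reactive power limit are hypothetical values, not those of the Bornholm network study.

```python
def q_of_u(u_pu, q_max=0.3, u_dead=(0.98, 1.02), u_sat=(0.94, 1.06)):
    """Reactive power set-point (p.u.) as a function of local voltage (p.u.).

    Inside the dead band no reactive power is exchanged; outside it the
    set-point ramps linearly and saturates at +/- q_max.
    """
    if u_pu < u_dead[0]:   # undervoltage: inject reactive power (capacitive)
        return q_max * min(1.0, (u_dead[0] - u_pu) / (u_dead[0] - u_sat[0]))
    if u_pu > u_dead[1]:   # overvoltage: absorb reactive power (inductive)
        return -q_max * min(1.0, (u_pu - u_dead[1]) / (u_sat[1] - u_dead[1]))
    return 0.0             # inside the dead band

for u in (0.93, 0.97, 1.00, 1.03, 1.07):
    print(f"U = {u:.2f} p.u. -> Q = {q_of_u(u):+.3f} p.u.")
```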

Relevance:

30.00%

Publisher:

Abstract:

Human activities have constantly increased greenhouse gas emissions in the atmosphere, which has a direct impact on global climate warming. Finland, as a European Union member, has developed a national structural plan to promote renewable energy generation, pursuing the aims of Directive 2009/28/EC. Finland is on the way to enhancing its national security of energy supply by increasing the diversity of its energy mix. There are several significant objectives for developing onshore and offshore wind energy generation in the country over the next few decades, as well as other renewable energy sources. To predict future changes, many scenario methods have been developed and adapted to the energy industry. This Master's thesis explored the fuzzy cognitive maps (FCM) approach to scenario development, which captures experts' knowledge in a graphical manner and uses those captures for testing and refining raw scenarios. The prospects of Finnish wind energy development up to the year 2030 were considered with the aid of the FCM technique. Five positive raw scenarios were developed, and three of them were tested against an integrated map of expert knowledge using graphical simulation. As an outcome, the study derives robust scenarios from the preliminarily defined ones, drawing on the results of the simulation. The thesis was conducted in such a way that the existing knowledge captured from the expert panel can be reused to test and deploy different sets of scenarios for Finnish wind energy development.
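
For readers unfamiliar with the technique, the following Python sketch shows the core FCM simulation loop: concept activations are propagated through a causal weight matrix and squashed until the map settles. The three concepts and their weights are invented for illustration and are not taken from the thesis.

```python
import numpy as np

def fcm_simulate(weights, state, n_iter=100, tol=1e-6):
    """Iterate x <- sigmoid(x + W.T @ x) until the map reaches a fixed point."""
    for _ in range(n_iter):
        new_state = 1.0 / (1.0 + np.exp(-(state + weights.T @ state)))
        if np.max(np.abs(new_state - state)) < tol:
            break
        state = new_state
    return state

# Concepts: 0 = policy support, 1 = installed wind capacity, 2 = grid constraints.
# weights[j, i] is the causal influence of concept j on concept i.
W = np.array([[0.0,  0.7,  0.0],    # policy support drives capacity growth
              [0.0,  0.0,  0.5],    # capacity growth stresses the grid
              [0.0, -0.4,  0.0]])   # grid constraints dampen capacity growth
x0 = np.array([0.8, 0.3, 0.1])      # initial activations of a raw scenario
print(fcm_simulate(W, x0))          # steady-state activations of the scenario
```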

Relevance:

30.00%

Publisher:

Abstract:

Active magnetic bearings offer many advantages that have brought new applications to industry. However, as with all new technology, active magnetic bearings also have downsides, one of which is the low level of standardization. This thesis mainly studies the ISO 14839 standard and, more specifically, its system verification methods. These verification methods are applied in a practical test on an existing active magnetic bearing system. The system is simulated in Matlab using a rotor-bearing dynamics toolbox, but this study does not include the exact simulation code or a direct algebraic calculation. However, this study provides proof that the standardized simulation methods can be applied to practical problems.
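
The thesis itself does not include simulation code, but as a generic illustration of the kind of verification ISO 14839-3 describes, the Python sketch below computes the peak of the closed-loop sensitivity function for a toy open-loop transfer function. Both the transfer function and the zone boundaries in the comments are illustrative assumptions, not values taken from the standard or the thesis.

```python
import numpy as np

def peak_sensitivity_db(L):
    """Peak of the sensitivity function |S| = |1 / (1 + L)| in dB."""
    return 20.0 * np.log10(np.abs(1.0 / (1.0 + L)).max())

w = np.logspace(-1, 3, 4000)     # frequency grid, rad/s
s = 1j * w
L = 50.0 / (s * (s + 2.0))       # toy open-loop (controller x plant) response
peak = peak_sensitivity_db(L)
print(f"peak sensitivity: {peak:.1f} dB")

# Indicative zone grading in the spirit of ISO 14839-3 (boundaries here
# are illustrative; the standard defines the normative limits).
for limit, zone in ((9.5, "A"), (12.0, "B"), (14.0, "C")):
    if peak < limit:
        print(f"zone {zone}")
        break
else:
    print("zone D")
```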

Relevance:

30.00%

Publisher:

Abstract:

The Kraft pulping process is the dominant chemical pulping process in the world. Roughly 195 million metric tons of black liquor are produced annually as a by-product of the Kraft pulping process. Black liquor consists of spent cooking chemicals and dissolved organics from the wood and can contain up to 0.15 wt% nitrogen on a dry solids basis. The cooking chemicals in black liquor are recovered in a chemical recovery cycle. Water is evaporated in the first stage of the chemical recovery cycle, so the black liquor has a dry solids content of 65-85% prior to combustion. During combustion of black liquor, a portion of the black liquor nitrogen is volatilized, finally forming N2 or NO. The rest of the nitrogen remains in the char as char nitrogen. During char conversion, fixed carbon is burned off, leaving the pulping chemicals as smelt, and the char nitrogen forms mostly smelt nitrogen (cyanate, OCN-). Smelt exits the recovery boiler and is dissolved in water. The cyanate in smelt decomposes in the presence of water, forming NH3, which causes nitrogen emissions from the rest of the chemical recovery cycle.

This thesis had two focuses: firstly, to determine how the nitrogen chemistry in the recovery boiler is affected by modification of black liquor; and secondly, to find out what causes cyanate formation during thermal conversion, and which parameters affect cyanate formation and decomposition during thermal conversion of black liquor.

The fate of added biosludge nitrogen in chemical recovery was determined in Paper I. The added biosludge increased the nitrogen content of black liquor. At the pulp mill, the added biosludge did not increase NO formation in the recovery boiler, but instead increased the amount of cyanate in green liquor. The increased cyanate caused more NH3 formation, which increased the NCG boiler's NO emissions. Laboratory-scale experiments showed an increase in both NO and cyanate formation after biosludge addition.

Black liquor can be modified, for example by adding solid biomass to increase the energy density of black liquor, or by separating lignin from black liquor by precipitation. The precipitated lignin can be utilized in the production of green chemicals or as a fuel. In Papers II and III, laboratory-scale experiments were conducted to determine the impact of black liquor modification on NO and cyanate formation. Removal of lignin from black liquor reduced the nitrogen content of the black liquor. In most cases, NO and cyanate formation decreased with increasing lignin removal; the exception was NO formation from lignin-lean soda liquors. The addition of biomass to black liquor resulted in a fuel mixture with a higher nitrogen content, due to the higher nitrogen content of biomass compared to black liquor. More NO and cyanate were formed from the fuel mixtures than from pure black liquor. The increased amount of formed cyanate led to the hypothesis that black liquor is catalytically active and converts a portion of the nitrogen in the mixed fuel to cyanate.

The mechanism behind cyanate formation during thermal conversion of black liquor was not clear before this thesis. Paper IV studies the cyanate formation of alkali-metal-loaded fuels during gasification in a CO2 atmosphere. The salts K2CO3, Na2CO3, and K2SO4 all promoted char nitrogen to cyanate conversion during gasification, while KCl and CaCO3 did not. It is now assumed that cyanate is formed when alkali metal carbonate, or an active intermediate of alkali metal carbonate (e.g. -CO2K), reacts with the char nitrogen, forming cyanate. By testing different fuels (bark, peat, and coal), each of which had a different form of organic nitrogen, it was concluded that the form of organic nitrogen in the char also has an impact on cyanate formation. Cyanate can be formed during pyrolysis of black liquor, but at temperatures of 900°C or above the formed cyanate decomposes. Cyanate formation in gasifying conditions with different levels of CO2 in the atmosphere was also studied. Most of the char nitrogen was converted to cyanate during gasification at 800-900°C in 13-50% CO2 in N2, and only 5% of the initial fuel nitrogen was converted to NO during char conversion. The formed smelt cyanate was stable at 800°C in 13% CO2, while it decomposed at 900°C in 13% CO2. Cyanate decomposition was faster at higher temperatures and in oxygen-containing atmospheres than in an inert atmosphere. The presence of CO2 in oxygen-containing atmospheres slowed down the decomposition of cyanate.

This work provides new information on how modification of black liquor affects the nitrogen chemistry during thermal conversion of black liquor and on what causes cyanate formation during thermal conversion. The formation and decomposition of cyanate were studied in order to provide new data useful for modeling the nitrogen chemistry in the recovery boiler.
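
As a purely illustrative aside (not from the thesis), the temperature dependence of cyanate decomposition described above can be pictured with a first-order Arrhenius model; the rate parameters in this Python sketch are hypothetical placeholders, chosen only to show why decomposition is faster at 900°C than at 800°C.

```python
import numpy as np

R = 8.314        # gas constant, J/(mol K)
A = 1.0e5        # pre-exponential factor, 1/s (hypothetical)
EA = 160_000.0   # activation energy, J/mol (hypothetical)

def cyanate_fraction(t_s, temp_c):
    """Remaining cyanate fraction after t_s seconds at temp_c (first-order decay)."""
    k = A * np.exp(-EA / (R * (temp_c + 273.15)))
    return np.exp(-k * t_s)

for temp in (800, 900):
    print(f"{temp} C: {cyanate_fraction(60.0, temp):.2f} of cyanate left after 60 s")
```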

Relevance:

30.00%

Publisher:

Abstract:

The aim of this study is to test the accrual-based model suggested by Dechow et al. (1995) in order to detect and compare earnings management practices in Finnish and French companies. The impact of the 2008 financial crisis on earnings management behavior in these countries is also tested by dividing the overall period 2003-2012 into two sub-periods: pre-crisis (2003-2008) and post-crisis (2009-2012). The results support the idea that companies in both countries engage in significant earnings management practices. During the post-crisis period, companies in Finland show income-inflating practices, while in France the opposite tendency (income deflating) is observed during the same period. The results for the assumption that managers of highly concentrated companies engage in income-enhancing practices differ between the two countries: while in Finland managers try to show better performance for bonuses or other contractual compensation motivations, in France they avoid paying dividends or high taxes.
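
A minimal sketch of the accrual model in question: the modified Jones model of Dechow et al. (1995) regresses scaled total accruals on scaled changes in revenues (net of receivables) and on property, plant, and equipment, and treats the residuals as discretionary accruals. The firm-year data below are hypothetical.

```python
import numpy as np

def discretionary_accruals(ta, assets_lag, d_rev, d_rec, ppe):
    """Modified Jones model: regress scaled total accruals on the scaled
    regressors; the residuals are the discretionary accruals."""
    y = ta / assets_lag
    X = np.column_stack([1.0 / assets_lag,
                         (d_rev - d_rec) / assets_lag,
                         ppe / assets_lag])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ coef          # deviation from the non-discretionary level

# Hypothetical firm-year data (200 observations).
rng = np.random.default_rng(1)
n = 200
assets_lag = rng.uniform(50.0, 500.0, n)   # lagged total assets
ta = rng.normal(0.0, 5.0, n)               # total accruals
d_rev = rng.normal(10.0, 3.0, n)           # change in revenues
d_rec = rng.normal(2.0, 1.0, n)            # change in receivables
ppe = rng.uniform(20.0, 200.0, n)          # property, plant and equipment
da = discretionary_accruals(ta, assets_lag, d_rev, d_rec, ppe)
print(f"mean discretionary accruals: {da.mean():+.4f}")
```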

Relevance:

30.00%

Publisher:

Abstract:

Software is a key component in many of the devices and products that we use every day. Most customers demand not only that their devices function as expected but also that the software be of high quality: reliable, fault-tolerant, efficient, etc. In short, it is not enough that a calculator gives the correct result of a calculation; we want the result instantly, in the right form, with minimal use of battery, etc. One of the key aspects of succeeding in today's industry is delivering high quality. In most software development projects, high-quality software is achieved by rigorous testing and good quality assurance practices. However, today, customers are asking for these high-quality software products at an ever-increasing pace, which leaves companies with less time for development. Software testing is an expensive activity because it requires much manual work. Testing, debugging, and verification are estimated to consume 50 to 75 per cent of the total development cost of complex software projects. Further, the most expensive software defects are those that have to be fixed after the product is released. One of the main challenges in software development is reducing the cost and time of software testing without sacrificing the quality of the developed software. It is often not enough to demonstrate only that a piece of software is functioning correctly. Usually, many other aspects of the software, such as performance, security, scalability, and usability, need to be verified as well. Testing these aspects of the software is traditionally referred to as non-functional testing. One of the major challenges of non-functional testing is that it is usually carried out at the end of the software development process, when most of the functionality is implemented, because non-functional aspects, such as performance or security, apply to the software as a whole.

In this thesis, we study the use of model-based testing. We present approaches to automatically generate tests from behavioral models to address some of these challenges, and we show that model-based testing is applicable not only to functional testing but also to non-functional testing. In its simplest form, performance testing is performed by executing multiple test sequences at once while observing the software in terms of responsiveness and stability rather than output. The main contribution of the thesis is a coherent model-based testing approach for testing functional and performance-related issues in software systems. We show how we go from system models, expressed in the Unified Modeling Language, to test cases and back to models again. The system requirements are traced throughout the entire testing process; requirements traceability facilitates finding faults in the design and implementation of the software. In the research field of model-based testing, many newly proposed approaches suffer from poor or lacking tool support. Therefore, the second contribution of this thesis is proper tool support for the proposed approach, integrated with leading industry tools. We offer independent tools, tools that are integrated with other industry-leading tools, and complete tool chains when necessary. Many model-based testing approaches proposed by the research community suffer from poor empirical validation in an industrial context. In order to demonstrate the applicability of our proposed approach, we apply our research to several systems, including industrial ones.
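
As a toy illustration of the basic model-based testing idea (not the thesis' UML-based tool chain), the Python sketch below derives test sequences from a behavioral model by covering every transition of a small state machine; the media-player model is invented for this example.

```python
from collections import deque

# A hypothetical behavioral model: (state, event) -> next state.
TRANSITIONS = {
    ("stopped", "play"): "playing",
    ("playing", "pause"): "paused",
    ("playing", "stop"): "stopped",
    ("paused", "play"): "playing",
    ("paused", "stop"): "stopped",
}

def transition_covering_tests(start="stopped"):
    """For each transition, find a shortest event sequence from the start
    state that ends by firing that transition (breadth-first search)."""
    tests = []
    for (src, event), _dst in TRANSITIONS.items():
        queue, seen = deque([(start, [])]), {start}
        while queue:
            state, path = queue.popleft()
            if state == src:
                tests.append(path + [event])   # reach src, then fire the event
                break
            for (s, e), d in TRANSITIONS.items():
                if s == state and d not in seen:
                    seen.add(d)
                    queue.append((d, path + [e]))
    return tests

for t in transition_covering_tests():
    print(" -> ".join(t))
```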

Relevance:

30.00%

Publisher:

Abstract:

Point-of-care (POC) diagnostics is a field with a rapidly growing market share. As these applications become more widely used, there is increasing pressure to improve their performance to match that of central laboratory tests. Lanthanide luminescence has been widely utilized in diagnostics because of the numerous advantages gained from time-resolved or anti-Stokes detection. So far, the use of lanthanide labels in POC has been scarce due to limitations set by the instrumentation required for their detection and the shortcomings, e.g. low brightness, of these labels. Along with the advances in lanthanide luminescence research and in the field of semiconductors, these materials are becoming a feasible alternative for signal generation in future POC assays. The aim of this thesis was to explore ways of utilizing time-resolved detection or anti-Stokes detection in POC applications. The long-lived fluorescence for time-resolved measurement can be produced with lanthanide chelates. The ultraviolet (UV) excitation required by these chelates is cumbersome to produce with POC-compatible fluorescence readers. In this thesis, the use of a novel light-harvesting ligand was studied. This molecule can be used to excite Eu(III) ions at wavelengths extending up to the visible part of the spectrum. An enhancement solution based on this ligand showed good performance in a proof-of-concept bioaffinity assay and produced a bright signal upon 365 nm excitation, thanks to the high molar absorptivity of the chelate. These features are crucial when developing miniaturized readers for the time-resolved detection of fluorescence. Upconverting phosphors (UCPs) were studied as an internal light source in glucose-sensing dry-chemistry test strips, and ways of utilizing their various emission wavelengths and near-infrared excitation were explored. The use of nanosized NaYF4:Yb3+,Tm3+ particles enabled the replacement of an external UV light source with a NIR laser and gave an additional degree of freedom in the optical setup of the detector instrument. The new method enabled a blood glucose measurement with results comparable to the current standard method of measuring reflectance. Microsized visible-emitting UCPs were used in a similar manner, but with a broadly absorbing indicator compound filtering the excitation and emission wavelengths of the UCP. This approach resulted in a novel way of benefiting from the non-linear relationship between the excitation power and the emission intensity of the UCPs, and enabled amplification of the signal response from the indicator dye.
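
The signal amplification mentioned last follows from simple power-law arithmetic: for an n-photon upconversion process the emission scales roughly as the n-th power of the excitation, so any attenuation of the excitation by the indicator dye is amplified in the emission. A small Python sketch with illustrative numbers:

```python
def upconversion_emission(excitation, n=2.0):
    """Emission intensity of an n-photon upconversion process (arbitrary units)."""
    return excitation ** n

# A hypothetical indicator dye that transmits 70% of the excitation light.
transmission = 0.7
drop = 1.0 - upconversion_emission(transmission) / upconversion_emission(1.0)
print(f"30% excitation loss -> {drop:.0%} emission loss")   # prints 51%
```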

Relevance:

30.00%

Publisher:

Abstract:

This thesis discusses the basic problem of modern portfolio theory: how to find the optimal allocation for an investment portfolio. The theory provides a solution in the form of an efficient portfolio, which minimises the risk of the portfolio with respect to the expected return. A central feature of all portfolios on the efficient frontier is that the investor needs to provide the expected return for each asset. Market anomalies are persistent patterns seen in the financial markets that cannot be explained by current asset pricing theory. The goal of this thesis is to study whether these anomalies can be observed among different asset classes and, if persistent patterns are found, whether the anomalies hold valuable information for determining the expected returns used in the portfolio optimization. Market anomalies and investment strategies based on them are studied with a rolling estimation window, where the return for the following period is always based on historical information; this is also crucial when rebalancing the portfolio. The anomalies investigated in this thesis are value, momentum, reversal, and idiosyncratic volatility. The research data includes price series of country-level stock indices, government bonds, currencies, and commodities. Modern portfolio theory and the views given by the anomalies are combined using the Black-Litterman model, which makes it possible to optimise the portfolio so that the investor's views are taken into account. When constructing the portfolios, the goal is to maximise the Sharpe ratio. The significance of the results is assessed by testing whether a strategy yields excess returns relative to those explained by the three-factor model. The most outstanding finding is that anomaly-based factors contain valuable information for enhancing efficient portfolio diversification. When the highest Sharpe ratios for each asset class are picked from the test factors and applied to the Black-Litterman model, the final portfolio achieves a superior risk-return combination. The highest Sharpe ratios are provided by the momentum strategy for stocks and by long-term reversal for the rest of the asset classes. Additionally, a strategy based on the value effect was highly appealing and performs essentially as well as the previously mentioned Sharpe strategy. When studying the anomalies, it is found that 12-month momentum is the strongest effect, especially for stock indices. In addition, high idiosyncratic volatility seems to be positively correlated with returns for country stock indices.
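
For reference, the Black-Litterman posterior mean combines the equilibrium returns pi with investor views of the form P mu = q via mu_BL = [(tau Sigma)^-1 + P' Omega^-1 P]^-1 [(tau Sigma)^-1 pi + P' Omega^-1 q]. A minimal Python sketch with hypothetical two-asset inputs (the anomaly views in the thesis would enter through P and q):

```python
import numpy as np

def black_litterman(pi, sigma, P, q, omega, tau=0.05):
    """Posterior expected returns given equilibrium returns pi and views P mu = q."""
    inv_tau_sigma = np.linalg.inv(tau * sigma)
    inv_omega = np.linalg.inv(omega)
    A = inv_tau_sigma + P.T @ inv_omega @ P       # posterior precision
    b = inv_tau_sigma @ pi + P.T @ inv_omega @ q  # precision-weighted means
    return np.linalg.solve(A, b)

pi = np.array([0.05, 0.03])                     # equilibrium expected returns
sigma = np.array([[0.04, 0.01], [0.01, 0.02]])  # return covariance matrix
P = np.array([[1.0, -1.0]])                     # view: asset 1 outperforms asset 2
q = np.array([0.04])                            # ... by 4% (e.g. a momentum signal)
omega = np.array([[0.01]])                      # uncertainty of the view
print(black_litterman(pi, sigma, P, q, omega))
```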